Export to HiveServer2 fails

We tried to export to a HiveServer2 instance on the same cluster (DAS Version: 6.1.24, Revision: c4413ff9dc94a0759fa71e037abac74429d3d052, Hadoop-Distribution: 2.6.0-cdh5.11.0 (cdh-5.11.0), JVM: 1.8) and got the following exception:

 INFO [2017-09-19 17:17:51.861] [JobScheduler thread-1] (JobScheduler.java:396) - Starting job 1492 (DAS Version: 6.1.24, Revision: c4413ff9dc94a0759fa71e037abac74429d3d052, Hadoop-Distribution: 2.6.0-cdh5.11.0 (cdh-5.11.0), JVM: 1.8)
INFO [2017-09-19 17:17:51.863] [JobScheduler thread-1] (NormalJobDriver.java:124) - Checking if JobExecutionValueObject{_id=1492} can be started
INFO [2017-09-19 17:17:51.884] [JobScheduler thread-1] (JobScheduler.java:430) - [Job 1492] Preparing job in job scheduler thread for CustomDataSink{id=247}...
INFO [2017-09-19 17:17:51.884] [JobScheduler thread-1] (JobScheduler.java:433) - [Job 1492] Preparing job in job scheduler thread for CustomDataSink{id=247}... done (0 sec)
INFO [2017-09-19 17:17:51.886] [JobScheduler worker1-thread-433] (JobSchedulerJob.java:89) - [Job 1492] Preparing job for CustomDataSink{id=247}...
INFO [2017-09-19 17:17:51.903] [JobScheduler worker1-thread-433] (JobSchedulerJob.java:94) - [Job 1492] Preparing job for CustomDataSink{id=247}... done (0 sec)
INFO [2017-09-19 17:17:51.910] [JobScheduler worker1-thread-433] (JobSchedulerJob.java:115) - Starting job ...
INFO [2017-09-19 17:17:51.927] [JobScheduler worker1-thread-433] (MrPlanRunnerV2.java:81) - Allow running Datameer job with up to 1 concurrent cluster jobs.
INFO [2017-09-19 17:17:51.934] [MrPlanRunnerV2] (JobExecutionTraceService.java:82) - Creating local job execution trace log at /app/datameer/Datameer-6.1.24-cdh-5.11.0/temp/cache/dfscache/local-job-execution-traces/1492
INFO [2017-09-19 17:17:51.936] [MrPlanRunnerV2] (TezClusterSession.java:43) - Creating a TEZ job for session Export job (1492): TwitterAnalysisExport with a job count 1
INFO [2017-09-19 17:17:51.936] [MrPlanRunnerV2] (ClusterJobFlow.java:149) - Created configuration for StageGraphClusterJobFlow{stages=[Stage{input=SingleInputConnector{}, streams=[RecordStream{sheetName=export, description=Export record processor}]}]}: ClusterJobConfiguration{enabledConsumers=[]}
INFO [2017-09-19 17:17:51.936] [MrPlanRunnerV2] (ClusterSession.java:197) - -------------------------------------------
INFO [2017-09-19 17:17:51.937] [MrPlanRunnerV2] (ClusterSession.java:198) - Running cluster job (TEZ) for 'Export job (1492): TwitterAnalysisExport#export(Export record processor)'
INFO [2017-09-19 17:17:51.937] [MrPlanRunnerV2] (ClusterSession.java:200) - Output (final): export (52f150a1-7292-47b0-954b-23c6ce698ce3)
INFO [2017-09-19 17:17:51.937] [MrPlanRunnerV2] (ClusterSession.java:202) - -------------------------------------------
INFO [2017-09-19 17:17:51.988] [MrPlanRunnerV2] (HiveServer2ThriftClient.java:322) - =======Chaining hive thrift transport types=======
INFO [2017-09-19 17:17:51.988] [MrPlanRunnerV2] (HiveServer2ThriftClient.java:428) - Creating NO_SASL transport with host 'demchdc8nux' port '10000' connectionTimeout 2147483647
INFO [2017-09-19 17:17:51.988] [MrPlanRunnerV2] (HiveServer2ThriftClient.java:448) - Creating PLAIN transport for user 'anonymous'
INFO [2017-09-19 17:17:51.989] [MrPlanRunnerV2] (HiveServer2ThriftClient.java:329) - =====================================
INFO [2017-09-19 17:17:52.045] [MrPlanRunnerV2] (HiveServer2ThriftClient.java:296) - Applying connection properties '{}'
INFO [2017-09-19 17:17:52.083] [MrPlanRunnerV2] (HiveServer2ThriftClient.java:91) - Underlying hive transport:org.apache.thrift.transport.TSaslClientTransport@665519d9
INFO [2017-09-19 17:17:53.911] [MrPlanRunnerV2] (FileOutputAdapter.java:175) - Initializing export to hdfs://ha-service-datalake/user/hive/warehouse/twitteranalysis/2017-09-19_17-17_53-datameer-export-%25adapter%25 with DFS[DFSClient[clientName=DFSClient_NONMAPREDUCE_1262814780_97667, ugi=datameer (auth:SIMPLE)]]
INFO [2017-09-19 17:17:53.912] [MrPlanRunnerV2] (HiveFileOutputAdapter.java:127) - Temp Output Directory set to:hdfs://ha-service-datalake/user/hive/warehouse/tmp_twitteranalysis
INFO [2017-09-19 17:17:53.950] [MrPlanRunnerV2] (HiveServer2ThriftClient.java:322) - =======Chaining hive thrift transport types=======
INFO [2017-09-19 17:17:53.950] [MrPlanRunnerV2] (HiveServer2ThriftClient.java:428) - Creating NO_SASL transport with host 'demchdc8nux' port '10000' connectionTimeout 2147483647
INFO [2017-09-19 17:17:53.950] [MrPlanRunnerV2] (HiveServer2ThriftClient.java:448) - Creating PLAIN transport for user 'anonymous'
INFO [2017-09-19 17:17:53.951] [MrPlanRunnerV2] (HiveServer2ThriftClient.java:329) - =====================================
INFO [2017-09-19 17:17:54.012] [MrPlanRunnerV2] (HiveServer2ThriftClient.java:296) - Applying connection properties '{}'
INFO [2017-09-19 17:17:54.027] [MrPlanRunnerV2] (HiveServer2ThriftClient.java:91) - Underlying hive transport:org.apache.thrift.transport.TSaslClientTransport@717bdaf
INFO [2017-09-19 17:17:54.133] [MrPlanRunnerV2] (TezJob.java:175) - Submitting DAG to Tez cluster with name:Export job (1492): TwitterAnalysisExport#export(Export record processor) (5ad3772a-7e34-43cd-ae1d-aaf180a83abf)
INFO [2017-09-19 17:17:54.134] [MrPlanRunnerV2] (LightweightDasJobContext.java:69) - Synchronize global task local resources with remote hdfs://ha-service-datalake/user/datameer/jobjars
INFO [2017-09-19 17:17:54.157] [MrPlanRunnerV2] (LightweightDasJobContext.java:85) - Synchronize job-specific task local resources with remote hdfs://ha-service-datalake/user/datameer/jobjars
INFO [2017-09-19 17:17:54.159] [MrPlanRunnerV2] (LightweightDasJobContext.java:111) - Synchronize additional task local resource 'temp/tez-execution/plugin-tez-1504684159000.jar' with remote filesystem hdfs://ha-service-datalake/user/datameer/jobjars
INFO [2017-09-19 17:17:54.161] [MrPlanRunnerV2] (LightweightDasJobContext.java:111) - Synchronize additional task local resource '/app/datameer/Datameer-6.1.24-cdh-5.11.0/webapps/conductor/WEB-INF/lib/hadoop-mapreduce-client-core-2.6.0-cdh5.11.0.jar' with remote filesystem hdfs://ha-service-datalake/user/datameer/jobjars
INFO [2017-09-19 17:17:54.180] [MrPlanRunnerV2] (TezSessionImpl.java:45) - Creating new TezClient...
INFO [2017-09-19 17:17:54.198] [MrPlanRunnerV2] (LightweightDasJobContext.java:111) - Synchronize additional task local resource 'temp/tez-execution/tez-libs-1504684159000/tez-api-0.7.1-dm1.jar' with remote filesystem hdfs://ha-service-datalake/user/datameer/jobjars
INFO [2017-09-19 17:17:54.200] [MrPlanRunnerV2] (LightweightDasJobContext.java:111) - Synchronize additional task local resource 'temp/tez-execution/tez-libs-1504684159000/tez-runtime-library-0.7.1-dm1.jar' with remote filesystem hdfs://ha-service-datalake/user/datameer/jobjars
INFO [2017-09-19 17:17:54.201] [MrPlanRunnerV2] (LightweightDasJobContext.java:111) - Synchronize additional task local resource 'temp/tez-execution/tez-libs-1504684159000/commons-collections4-4.1.jar' with remote filesystem hdfs://ha-service-datalake/user/datameer/jobjars
INFO [2017-09-19 17:17:54.203] [MrPlanRunnerV2] (LightweightDasJobContext.java:111) - Synchronize additional task local resource 'temp/tez-execution/tez-libs-1504684159000/RoaringBitmap-0.4.9.jar' with remote filesystem hdfs://ha-service-datalake/user/datameer/jobjars
INFO [2017-09-19 17:17:54.205] [MrPlanRunnerV2] (LightweightDasJobContext.java:111) - Synchronize additional task local resource 'temp/tez-execution/tez-libs-1504684159000/tez-runtime-internals-0.7.1-dm1.jar' with remote filesystem hdfs://ha-service-datalake/user/datameer/jobjars
INFO [2017-09-19 17:17:54.206] [MrPlanRunnerV2] (LightweightDasJobContext.java:111) - Synchronize additional task local resource 'temp/tez-execution/tez-libs-1504684159000/tez-yarn-timeline-history-0.7.1.jar' with remote filesystem hdfs://ha-service-datalake/user/datameer/jobjars
INFO [2017-09-19 17:17:54.208] [MrPlanRunnerV2] (LightweightDasJobContext.java:111) - Synchronize additional task local resource 'temp/tez-execution/tez-libs-1504684159000/tez-common-0.7.1-dm1.jar' with remote filesystem hdfs://ha-service-datalake/user/datameer/jobjars
INFO [2017-09-19 17:17:54.210] [MrPlanRunnerV2] (LightweightDasJobContext.java:111) - Synchronize additional task local resource 'temp/tez-execution/tez-libs-1504684159000/tez-yarn-timeline-history-with-acls-0.7.1.jar' with remote filesystem hdfs://ha-service-datalake/user/datameer/jobjars
INFO [2017-09-19 17:17:54.212] [MrPlanRunnerV2] (LightweightDasJobContext.java:111) - Synchronize additional task local resource 'temp/tez-execution/tez-libs-1504684159000/tez-dag-0.7.1-dm1.jar' with remote filesystem hdfs://ha-service-datalake/user/datameer/jobjars
INFO [2017-09-19 17:17:54.217] [MrPlanRunnerV2] (TezClient.java:173) - Tez Client Version: [ component=tez-api, version=0.7.1, revision=6868944d862113485ec98b9480e9f2445bf2b34d, SCM-URL=scm:git:https://git-wip-us.apache.org/repos/asf/tez.git, buildTime=2016-10-25T11:53:48Z ]
INFO [2017-09-19 17:17:54.217] [MrPlanRunnerV2] (TezClientFacade.java:325) - Starting Tez session ...
INFO [2017-09-19 17:17:54.235] [MrPlanRunnerV2] (TezClient.java:375) - Session mode. Starting session.
INFO [2017-09-19 17:17:54.235] [MrPlanRunnerV2] (TezClientUtils.java:173) - Using tez.lib.uris value from configuration: hdfs://ha-service-datalake/user/datameer/jobjars/6.1.24/tez-jars/tez-api-0.7.1-dm1.jar_7a3fbb77e0309e302cc87e207ed89a6e.jar,hdfs://ha-service-datalake/user/datameer/jobjars/6.1.24/tez-jars/tez-runtime-library-0.7.1-dm1.jar_1c5940d503d9646d3ad7ad96bd362dc6.jar,hdfs://ha-service-datalake/user/datameer/jobjars/6.1.24/tez-jars/commons-collections4-4.1.jar_45af6a8e5b51d5945de6c7411e290bd1.jar,hdfs://ha-service-datalake/user/datameer/jobjars/6.1.24/tez-jars/RoaringBitmap-0.4.9.jar_f4b4ae423753b1a7b34fa17810dc21fd.jar,hdfs://ha-service-datalake/user/datameer/jobjars/6.1.24/tez-jars/tez-runtime-internals-0.7.1-dm1.jar_3e9ed37ade04d36bd2584e69228551ea.jar,hdfs://ha-service-datalake/user/datameer/jobjars/6.1.24/tez-jars/tez-yarn-timeline-history-0.7.1.jar_72fe4045e4ceb69431b68b099de337f7.jar,hdfs://ha-service-datalake/user/datameer/jobjars/6.1.24/tez-jars/tez-common-0.7.1-dm1.jar_cc4aca371a1bd1aa074a8af3a0dd2e80.jar,hdfs://ha-service-datalake/user/datameer/jobjars/6.1.24/tez-jars/tez-yarn-timeline-history-with-acls-0.7.1.jar_690ec62e48dca499e8cc5fae8fb4ce52.jar,hdfs://ha-service-datalake/user/datameer/jobjars/6.1.24/tez-jars/tez-dag-0.7.1-dm1.jar_6a1704162552b8b04a831d624503e191.jar
INFO [2017-09-19 17:17:54.264] [MrPlanRunnerV2] (TezCommonUtils.java:122) - Tez system stage directory hdfs://ha-service-datalake/user/datameer/temp/job-1492/.staging-79faab98-1399-48ae-b946-5f290bdf1c89/.tez/application_1502598274024_1298 doesn't exist and is created
INFO [2017-09-19 17:17:54.501] [MrPlanRunnerV2] (YarnClientImpl.java:260) - Submitted application application_1502598274024_1298
INFO [2017-09-19 17:17:54.502] [MrPlanRunnerV2] (TezClient.java:409) - The url to track the Tez Session: http://demchdc8nux.dc4ca.siemens.de:8088/proxy/application_1502598274024_1298/
INFO [2017-09-19 17:17:54.503] [MrPlanRunnerV2] (TezClientFacade.java:327) - Starting Tez session done
INFO [2017-09-19 17:17:54.503] [MrPlanRunnerV2] (TezClientFacade.java:329) - Wait until Tez session ready (remaining attempts 2) ...
INFO [2017-09-19 17:17:57.077] [MrPlanRunnerV2] (TezClientFacade.java:331) - Wait until Tez session ready done
INFO [2017-09-19 17:17:57.081] [MrPlanRunnerV2] (DagRunner.java:64) - Submitting DAG 'Export job (1492): TwitterAnalysisExport#export(Export record processor) (5ad3772a-7e34-43cd-ae1d-aaf180a83abf)'.
INFO [2017-09-19 17:17:57.081] [MrPlanRunnerV2] (TezClient.java:452) - Submitting dag to TezSession, sessionName=Export job (1492): TwitterAnalysisExport#export(Export record processor) (5ad3772a-7e34-43cd-ae1d-aaf180a83abf), applicationId=application_1502598274024_1298, dagName=Export job (1492): TwitterAnalysisExport#export(Export record processor) (5ad3772a-7e34-43cd-ae1d-aaf180a83abf)
INFO [2017-09-19 17:17:57.369] [MrPlanRunnerV2] (TezClient.java:527) - Submitted dag to TezSession, sessionName=Export job (1492): TwitterAnalysisExport#export(Export record processor) (5ad3772a-7e34-43cd-ae1d-aaf180a83abf), applicationId=application_1502598274024_1298, dagName=Export job (1492): TwitterAnalysisExport#export(Export record processor) (5ad3772a-7e34-43cd-ae1d-aaf180a83abf)
INFO [2017-09-19 17:17:57.390] [MrPlanRunnerV2] (DagRunner.java:66) - Submitted DAG 'Export job (1492): TwitterAnalysisExport#export(Export record processor) (5ad3772a-7e34-43cd-ae1d-aaf180a83abf)'.
INFO [2017-09-19 17:17:57.391] [MrPlanRunnerV2] (DagRunner.java:115) - Waiting for DAG to finish: DAG name=Export job (1492): TwitterAnalysisExport#export(Export record processor) (5ad3772a-7e34-43cd-ae1d-aaf180a83abf), polling interval=500ms
INFO [2017-09-19 17:17:58.278] [MrPlanRunnerV2] (DagRunner.java:141) - DAG initialized: CurrentState=Running, DAG name=Export job (1492): TwitterAnalysisExport#export(Export record processor) (5ad3772a-7e34-43cd-ae1d-aaf180a83abf)
INFO [2017-09-19 17:17:58.279] [MrPlanRunnerV2] (DagRunner.java:189) - DAG status: state=RUNNING, progress=0%, Unknown task count, name=Export job (1492): TwitterAnalysisExport#export(Export record processor) (5ad3772a-7e34-43cd-ae1d-aaf180a83abf)
INFO [2017-09-19 17:18:03.297] [MrPlanRunnerV2] (DagRunner.java:189) - DAG status: state=RUNNING, progress=0%, TotalTasks: 1 Succeeded: 0 Running: 1 Failed: 0 Killed: 0 FailedTaskAttempts: 1, name=Export job (1492): TwitterAnalysisExport#export(Export record processor) (5ad3772a-7e34-43cd-ae1d-aaf180a83abf)
INFO [2017-09-19 17:18:08.315] [MrPlanRunnerV2] (DagRunner.java:189) - DAG status: state=RUNNING, progress=0%, TotalTasks: 1 Succeeded: 0 Running: 1 Failed: 0 Killed: 0 FailedTaskAttempts: 2, name=Export job (1492): TwitterAnalysisExport#export(Export record processor) (5ad3772a-7e34-43cd-ae1d-aaf180a83abf)
INFO [2017-09-19 17:18:13.332] [MrPlanRunnerV2] (DagRunner.java:189) - DAG status: state=RUNNING, progress=0%, TotalTasks: 1 Succeeded: 0 Running: 1 Failed: 0 Killed: 0 FailedTaskAttempts: 3, name=Export job (1492): TwitterAnalysisExport#export(Export record processor) (5ad3772a-7e34-43cd-ae1d-aaf180a83abf)
INFO [2017-09-19 17:18:14.838] [MrPlanRunnerV2] (DagRunner.java:189) - DAG status: state=FAILED, progress=0%, TotalTasks: 1 Succeeded: 0 Running: 0 Failed: 1 Killed: 0 FailedTaskAttempts: 4, name=Export job (1492): TwitterAnalysisExport#export(Export record processor) (5ad3772a-7e34-43cd-ae1d-aaf180a83abf)
INFO [2017-09-19 17:18:14.838] [MrPlanRunnerV2] (DagRunner.java:156) - Finished DAG 'Export job (1492): TwitterAnalysisExport#export(Export record processor) (5ad3772a-7e34-43cd-ae1d-aaf180a83abf)' (application=application_1502598274024_1298) with status=FAILED
INFO [2017-09-19 17:18:14.838] [MrPlanRunnerV2] (DatameerTezUtils.java:27) - Tasks: succeeded=0, failed=1 for 'Map for sheets:[export] (1b72f092-b4d8-467e-ad0c-b40760b4c661)'
INFO [2017-09-19 17:18:14.839] [MrPlanRunnerV2] (PoolingTezSessionFactory.java:58) - Returning TezSessionImpl{clientName=Export job (1492): TwitterAnalysisExport#export(Export record processor) (5ad3772a-7e34-43cd-ae1d-aaf180a83abf), applicationId=application_1502598274024_1298} to ReuseSessionFactory{source=AlwaysNewSessionFactory{}}.
INFO [2017-09-19 17:18:14.843] [MrPlanRunnerV2] (TezJob.java:164) - Completed Tez job 'Export job (1492): TwitterAnalysisExport#export(Export record processor)' with output path: hdfs://ha-service-datalake/user/datameer/temp/job-1492/...
INFO [2017-09-19 17:18:14.844] [MrPlanRunnerV2] (HiveFileOutputAdapter.java:127) - Temp Output Directory set to:hdfs://ha-service-datalake/user/hive/warehouse/tmp_twitteranalysis
INFO [2017-09-19 17:18:14.846] [MrPlanRunnerV2] (HiveFileOutputAdapter.java:127) - Temp Output Directory set to:hdfs://ha-service-datalake/user/hive/warehouse/tmp_twitteranalysis
INFO [2017-09-19 17:18:14.847] [MrPlanRunnerV2] (ClusterJob.java:131) - Tez Execution Framework completed cluster job 'Export job (1492): TwitterAnalysisExport#export(Export record processor)' [22 sec]
ERROR [2017-09-19 17:18:14.847] [MrPlanRunnerV2] (ClusterSession.java:220) - Failed to run cluster job 'Export job (1492): TwitterAnalysisExport#export(Export record processor)' [22 sec]
datameer.com.google.common.base.VerifyException: Finished DAG 'Export job (1492): TwitterAnalysisExport#export(Export record processor) (5ad3772a-7e34-43cd-ae1d-aaf180a83abf)' (application_1502598274024_1298) with state FAILED and diagnostics: [Vertex failed, vertexName=Map for sheets:[export] (1b72f092-b4d8-467e-ad0c-b40760b4c661), vertexId=vertex_1502598274024_1298_1_00, diagnostics=[Task failed, taskId=task_1502598274024_1298_1_00_000000, diagnostics=[TaskAttempt 0 failed, info=[Error: Fatal Error cause TezChild exit.:java.lang.NoClassDefFoundError: org/apache/hadoop/mapred/MRVersion
at org.apache.hadoop.hive.shims.Hadoop23Shims.isMR2(Hadoop23Shims.java:909)
at org.apache.hadoop.hive.shims.Hadoop23Shims.getHadoopConfNames(Hadoop23Shims.java:980)
at org.apache.hadoop.hive.conf.HiveConf$ConfVars.<clinit>(HiveConf.java:363)
at org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.<init>(LazySerDeParameters.java:87)
at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.initialize(LazySimpleSerDe.java:122)
at datameer.das.plugin.hive.HiveUtil.getSerializer(HiveUtil.java:285)
at datameer.das.plugin.hive.HiveFileOutputAdapter.connect(HiveFileOutputAdapter.java:107)
at datameer.dap.sdk.util.FileOutputAdapter.connect(FileOutputAdapter.java:421)
at datameer.dap.sdk.util.FileOutputAdapter.openNewFile(FileOutputAdapter.java:254)
at datameer.dap.sdk.util.FileOutputAdapter.connectExportInstance(FileOutputAdapter.java:244)
at datameer.das.plugin.hive.HiveOutputAdapter.connectExportInstance(HiveOutputAdapter.java:124)
at datameer.dap.common.graphv2.ProcessingContext.initializeOutputAdapter(ProcessingContext.java:395)
at datameer.dap.common.job.dapexport.ExportRecordProcessor.process(ExportRecordProcessor.java:40)
at datameer.dap.common.graphv2.ClusterTaskOperations$Operation.connect(ClusterTaskOperations.java:113)
at datameer.dap.common.graphv2.ClusterTaskOperations.connectOperationsToSource(ClusterTaskOperations.java:162)
at datameer.dap.common.graphv2.ClusterTaskOperations.connectOperationsByAlias(ClusterTaskOperations.java:157)
at datameer.dap.common.graphv2.ClusterTaskOperations.connect(ClusterTaskOperations.java:151)
at datameer.plugin.tez.processing.SimpleVertexProcessor.run(SimpleVertexProcessor.java:121)
at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:362)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:179)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:171)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:171)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:167)
at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapred.MRVersion
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at datameer.dap.sdk.plugin.PluginClassLoader.loadClass(PluginClassLoader.java:61)
... 31 more
], TaskAttempt 1 failed, TaskAttempt 2 failed, TaskAttempt 3 failed, info=[stack traces identical to TaskAttempt 0, elided for brevity
]], Vertex did not succeed due to OWN_TASK_FAILURE, failedTasks:1 killedTasks:0, Vertex vertex_1502598274024_1298_1_00 [Map for sheets:[export] (1b72f092-b4d8-467e-ad0c-b40760b4c661)] killed/failed due to:OWN_TASK_FAILURE], DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:0]
at datameer.com.google.common.base.Verify.verify(Verify.java:125)
at datameer.plugin.tez.TezJob.runTezDag(TezJob.java:180)
at datameer.plugin.tez.TezJob.runImpl(TezJob.java:154)
at datameer.dap.common.graphv2.ClusterJob.run(ClusterJob.java:125)
at datameer.dap.common.graphv2.ClusterSession.execute(ClusterSession.java:206)
at datameer.dap.common.graphv2.ClusterSession.runAllClusterJobs(ClusterSession.java:343)
at datameer.dap.common.graphv2.MrPlanRunnerV2.run(MrPlanRunnerV2.java:129)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at datameer.dap.common.security.DatameerSecurityService.runAsUser(DatameerSecurityService.java:109)
at datameer.dap.common.security.DatameerSecurityService.runAsUser(DatameerSecurityService.java:186)
at datameer.dap.common.security.RunAsThread$1.run(RunAsThread.java:34)
at datameer.dap.common.security.RunAsThread$1.run(RunAsThread.java:30)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
at datameer.dap.common.filesystem.Impersonator.doAs(Impersonator.java:31)
at datameer.dap.common.security.RunAsThread.run(RunAsThread.java:30)
INFO [2017-09-19 17:18:14.848] [MrPlanRunnerV2] (ClusterSession.java:223) - -------------------------------------------
INFO [2017-09-19 17:18:14.849] [MrPlanRunnerV2] (ClusterSession.java:81) - Committing failed job and moving job output from 'hdfs://ha-service-datalake/user/datameer/temp/job-1492' to 'hdfs://ha-service-datalake/user/datameer/exportjobs/247/1492'.
INFO [2017-09-19 17:18:14.856] [MrPlanRunnerV2] (ClusterSession.java:129) - Completed job flow with FAILURE and 0 completed cluster jobs. (hdfs://ha-service-datalake/user/datameer/exportjobs/247/1492)
INFO [2017-09-19 17:18:14.857] [MrPlanRunnerV2] (PoolingTezSessionFactory.java:150) - Closing ReuseSessionFactory{source=AlwaysNewSessionFactory{}}.
INFO [2017-09-19 17:18:14.857] [MrPlanRunnerV2] (TezSessionImpl.java:70) - Closing TezSessionImpl{clientName=Export job (1492): TwitterAnalysisExport#export(Export record processor) (5ad3772a-7e34-43cd-ae1d-aaf180a83abf), applicationId=application_1502598274024_1298}
INFO [2017-09-19 17:18:14.857] [MrPlanRunnerV2] (TezClient.java:548) - Shutting down Tez Session, sessionName=Export job (1492): TwitterAnalysisExport#export(Export record processor) (5ad3772a-7e34-43cd-ae1d-aaf180a83abf), applicationId=application_1502598274024_1298
INFO [2017-09-19 17:18:14.988] [MrPlanRunnerV2] (HarBuilder.java:77) - Created har file at hdfs://ha-service-datalake/user/datameer/jobhistory/247/1492/job-metadata.har.tmp out of [hdfs://ha-service-datalake/user/datameer/jobhistory/247/1492/cluster-jobs.json, hdfs://ha-service-datalake/user/datameer/jobhistory/247/1492/job-conf.xml, hdfs://ha-service-datalake/user/datameer/jobhistory/247/1492/job-definition-tweetAnalysis.json, hdfs://ha-service-datalake/user/datameer/jobhistory/247/1492/job-definition.json, hdfs://ha-service-datalake/user/datameer/jobhistory/247/1492/job-plan-compiled.dot, hdfs://ha-service-datalake/user/datameer/jobhistory/247/1492/job-plan-original.dot, hdfs://ha-service-datalake/user/datameer/jobhistory/247/1492/application_1502598274024_1298/job-conf-cluster.xml]. Moving it to hdfs://ha-service-datalake/user/datameer/jobhistory/247/1492/job-metadata.har
INFO [2017-09-19 17:18:15.007] [MrPlanRunnerV2] (ClusterSession.java:364) - Deleting temporary job directory hdfs://ha-service-datalake/user/datameer/temp/job-1492
INFO [2017-09-19 17:18:15.023] [MrPlanRunnerV2] (DatameerJobStorage.java:160) - Copying job execution trace log from /app/datameer/Datameer-6.1.24-cdh-5.11.0/temp/cache/dfscache/local-job-execution-traces/1492 to hdfs://ha-service-datalake/user/datameer/jobhistory/247/1492/job-execution-trace.log
INFO [2017-09-19 17:18:15.035] [JobScheduler worker1-thread-433] (DapJobCounter.java:172) - Job FAILURE with '0' mr-jobs and following counters:
INFO [2017-09-19 17:18:15.035] [JobScheduler worker1-thread-433] (DapJobCounter.java:175) - EXPORT_RECORDS: 0
INFO [2017-09-19 17:18:15.035] [JobScheduler worker1-thread-433] (DapJobCounter.java:175) - EXPORT_DROPPED_RECORDS: 0
ERROR [2017-09-19 17:18:15.974] [JobScheduler thread-1] (JobScheduler.java:829) - Job 1492 failed with exception.
java.lang.RuntimeException: Failed to run cluster job for 'Export job (1492): TwitterAnalysisExport#export(Export record processor)'
at datameer.dap.common.graphv2.ClusterSession.execute(ClusterSession.java:221)
at datameer.dap.common.graphv2.ClusterSession.runAllClusterJobs(ClusterSession.java:343)
at datameer.dap.common.graphv2.MrPlanRunnerV2.run(MrPlanRunnerV2.java:129)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at datameer.dap.common.security.DatameerSecurityService.runAsUser(DatameerSecurityService.java:109)
at datameer.dap.common.security.DatameerSecurityService.runAsUser(DatameerSecurityService.java:186)
at datameer.dap.common.security.RunAsThread$1.run(RunAsThread.java:34)
at datameer.dap.common.security.RunAsThread$1.run(RunAsThread.java:30)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
at datameer.dap.common.filesystem.Impersonator.doAs(Impersonator.java:31)
at datameer.dap.common.security.RunAsThread.run(RunAsThread.java:30)
Caused by: datameer.com.google.common.base.VerifyException: Finished DAG 'Export job (1492): TwitterAnalysisExport#export(Export record processor) (5ad3772a-7e34-43cd-ae1d-aaf180a83abf)' (application_1502598274024_1298) with state FAILED and diagnostics: [Vertex failed, vertexName=Map for sheets:[export] (1b72f092-b4d8-467e-ad0c-b40760b4c661), vertexId=vertex_1502598274024_1298_1_00, diagnostics=[Task failed, taskId=task_1502598274024_1298_1_00_000000, diagnostics=[TaskAttempt 0 failed, info=[Error: Fatal Error cause TezChild exit.:java.lang.NoClassDefFoundError: org/apache/hadoop/mapred/MRVersion
at org.apache.hadoop.hive.shims.Hadoop23Shims.isMR2(Hadoop23Shims.java:909)
at org.apache.hadoop.hive.shims.Hadoop23Shims.getHadoopConfNames(Hadoop23Shims.java:980)
at org.apache.hadoop.hive.conf.HiveConf$ConfVars.<clinit>(HiveConf.java:363)
at org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.<init>(LazySerDeParameters.java:87)
at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.initialize(LazySimpleSerDe.java:122)
at datameer.das.plugin.hive.HiveUtil.getSerializer(HiveUtil.java:285)
at datameer.das.plugin.hive.HiveFileOutputAdapter.connect(HiveFileOutputAdapter.java:107)
at datameer.dap.sdk.util.FileOutputAdapter.connect(FileOutputAdapter.java:421)
at datameer.dap.sdk.util.FileOutputAdapter.openNewFile(FileOutputAdapter.java:254)
at datameer.dap.sdk.util.FileOutputAdapter.connectExportInstance(FileOutputAdapter.java:244)
at datameer.das.plugin.hive.HiveOutputAdapter.connectExportInstance(HiveOutputAdapter.java:124)
at datameer.dap.common.graphv2.ProcessingContext.initializeOutputAdapter(ProcessingContext.java:395)
at datameer.dap.common.job.dapexport.ExportRecordProcessor.process(ExportRecordProcessor.java:40)
at datameer.dap.common.graphv2.ClusterTaskOperations$Operation.connect(ClusterTaskOperations.java:113)
at datameer.dap.common.graphv2.ClusterTaskOperations.connectOperationsToSource(ClusterTaskOperations.java:162)
at datameer.dap.common.graphv2.ClusterTaskOperations.connectOperationsByAlias(ClusterTaskOperations.java:157)
at datameer.dap.common.graphv2.ClusterTaskOperations.connect(ClusterTaskOperations.java:151)
at datameer.plugin.tez.processing.SimpleVertexProcessor.run(SimpleVertexProcessor.java:121)
at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:362)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:179)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:171)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:171)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:167)
at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapred.MRVersion
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at datameer.dap.sdk.plugin.PluginClassLoader.loadClass(PluginClassLoader.java:61)
... 31 more
], TaskAttempt 1 failed, info=[Error: Fatal Error cause TezChild exit.:java.lang.NoClassDefFoundError: org/apache/hadoop/mapred/MRVersion
at org.apache.hadoop.hive.shims.Hadoop23Shims.isMR2(Hadoop23Shims.java:909)
at org.apache.hadoop.hive.shims.Hadoop23Shims.getHadoopConfNames(Hadoop23Shims.java:980)
at org.apache.hadoop.hive.conf.HiveConf$ConfVars.<clinit>(HiveConf.java:363)
at org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.<init>(LazySerDeParameters.java:87)
at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.initialize(LazySimpleSerDe.java:122)
at datameer.das.plugin.hive.HiveUtil.getSerializer(HiveUtil.java:285)
at datameer.das.plugin.hive.HiveFileOutputAdapter.connect(HiveFileOutputAdapter.java:107)
at datameer.dap.sdk.util.FileOutputAdapter.connect(FileOutputAdapter.java:421)
at datameer.dap.sdk.util.FileOutputAdapter.openNewFile(FileOutputAdapter.java:254)
at datameer.dap.sdk.util.FileOutputAdapter.connectExportInstance(FileOutputAdapter.java:244)
at datameer.das.plugin.hive.HiveOutputAdapter.connectExportInstance(HiveOutputAdapter.java:124)
at datameer.dap.common.graphv2.ProcessingContext.initializeOutputAdapter(ProcessingContext.java:395)
at datameer.dap.common.job.dapexport.ExportRecordProcessor.process(ExportRecordProcessor.java:40)
at datameer.dap.common.graphv2.ClusterTaskOperations$Operation.connect(ClusterTaskOperations.java:113)
at datameer.dap.common.graphv2.ClusterTaskOperations.connectOperationsToSource(ClusterTaskOperations.java:162)
at datameer.dap.common.graphv2.ClusterTaskOperations.connectOperationsByAlias(ClusterTaskOperations.java:157)
at datameer.dap.common.graphv2.ClusterTaskOperations.connect(ClusterTaskOperations.java:151)
at datameer.plugin.tez.processing.SimpleVertexProcessor.run(SimpleVertexProcessor.java:121)
at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:362)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:179)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:171)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:171)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:167)
at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapred.MRVersion
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at datameer.dap.sdk.plugin.PluginClassLoader.loadClass(PluginClassLoader.java:61)
... 31 more
], TaskAttempt 2 failed, info=[Error: Fatal Error cause TezChild exit.:java.lang.NoClassDefFoundError: org/apache/hadoop/mapred/MRVersion
at org.apache.hadoop.hive.shims.Hadoop23Shims.isMR2(Hadoop23Shims.java:909)
at org.apache.hadoop.hive.shims.Hadoop23Shims.getHadoopConfNames(Hadoop23Shims.java:980)
at org.apache.hadoop.hive.conf.HiveConf$ConfVars.<clinit>(HiveConf.java:363)
at org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.<init>(LazySerDeParameters.java:87)
at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.initialize(LazySimpleSerDe.java:122)
at datameer.das.plugin.hive.HiveUtil.getSerializer(HiveUtil.java:285)
at datameer.das.plugin.hive.HiveFileOutputAdapter.connect(HiveFileOutputAdapter.java:107)
at datameer.dap.sdk.util.FileOutputAdapter.connect(FileOutputAdapter.java:421)
at datameer.dap.sdk.util.FileOutputAdapter.openNewFile(FileOutputAdapter.java:254)
at datameer.dap.sdk.util.FileOutputAdapter.connectExportInstance(FileOutputAdapter.java:244)
at datameer.das.plugin.hive.HiveOutputAdapter.connectExportInstance(HiveOutputAdapter.java:124)
at datameer.dap.common.graphv2.ProcessingContext.initializeOutputAdapter(ProcessingContext.java:395)
at datameer.dap.common.job.dapexport.ExportRecordProcessor.process(ExportRecordProcessor.java:40)
at datameer.dap.common.graphv2.ClusterTaskOperations$Operation.connect(ClusterTaskOperations.java:113)
at datameer.dap.common.graphv2.ClusterTaskOperations.connectOperationsToSource(ClusterTaskOperations.java:162)
at datameer.dap.common.graphv2.ClusterTaskOperations.connectOperationsByAlias(ClusterTaskOperations.java:157)
at datameer.dap.common.graphv2.ClusterTaskOperations.connect(ClusterTaskOperations.java:151)
at datameer.plugin.tez.processing.SimpleVertexProcessor.run(SimpleVertexProcessor.java:121)
at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:362)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:179)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:171)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:171)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:167)
at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapred.MRVersion
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at datameer.dap.sdk.plugin.PluginClassLoader.loadClass(PluginClassLoader.java:61)
... 31 more
], TaskAttempt 3 failed, info=[Error: Fatal Error cause TezChild exit.:java.lang.NoClassDefFoundError: org/apache/hadoop/mapred/MRVersion
at org.apache.hadoop.hive.shims.Hadoop23Shims.isMR2(Hadoop23Shims.java:909)
at org.apache.hadoop.hive.shims.Hadoop23Shims.getHadoopConfNames(Hadoop23Shims.java:980)
at org.apache.hadoop.hive.conf.HiveConf$ConfVars.<clinit>(HiveConf.java:363)
at org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.<init>(LazySerDeParameters.java:87)
at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.initialize(LazySimpleSerDe.java:122)
at datameer.das.plugin.hive.HiveUtil.getSerializer(HiveUtil.java:285)
at datameer.das.plugin.hive.HiveFileOutputAdapter.connect(HiveFileOutputAdapter.java:107)
at datameer.dap.sdk.util.FileOutputAdapter.connect(FileOutputAdapter.java:421)
at datameer.dap.sdk.util.FileOutputAdapter.openNewFile(FileOutputAdapter.java:254)
at datameer.dap.sdk.util.FileOutputAdapter.connectExportInstance(FileOutputAdapter.java:244)
at datameer.das.plugin.hive.HiveOutputAdapter.connectExportInstance(HiveOutputAdapter.java:124)
at datameer.dap.common.graphv2.ProcessingContext.initializeOutputAdapter(ProcessingContext.java:395)
at datameer.dap.common.job.dapexport.ExportRecordProcessor.process(ExportRecordProcessor.java:40)
at datameer.dap.common.graphv2.ClusterTaskOperations$Operation.connect(ClusterTaskOperations.java:113)
at datameer.dap.common.graphv2.ClusterTaskOperations.connectOperationsToSource(ClusterTaskOperations.java:162)
at datameer.dap.common.graphv2.ClusterTaskOperations.connectOperationsByAlias(ClusterTaskOperations.java:157)
at datameer.dap.common.graphv2.ClusterTaskOperations.connect(ClusterTaskOperations.java:151)
at datameer.plugin.tez.processing.SimpleVertexProcessor.run(SimpleVertexProcessor.java:121)
at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:362)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:179)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:171)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:171)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:167)
at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapred.MRVersion
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at datameer.dap.sdk.plugin.PluginClassLoader.loadClass(PluginClassLoader.java:61)
... 31 more
]], Vertex did not succeed due to OWN_TASK_FAILURE, failedTasks:1 killedTasks:0, Vertex vertex_1502598274024_1298_1_00 [Map for sheets:[export] (1b72f092-b4d8-467e-ad0c-b40760b4c661)] killed/failed due to:OWN_TASK_FAILURE], DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:0]
at datameer.com.google.common.base.Verify.verify(Verify.java:125)
at datameer.plugin.tez.TezJob.runTezDag(TezJob.java:180)
at datameer.plugin.tez.TezJob.runImpl(TezJob.java:154)
at datameer.dap.common.graphv2.ClusterJob.run(ClusterJob.java:125)
at datameer.dap.common.graphv2.ClusterSession.execute(ClusterSession.java:206)
... 12 more
INFO [2017-09-19 17:18:16.002] [JobScheduler thread-1] (JobScheduler.java:904) - Computing after job completion operations for execution 1492 (type=NORMAL)
INFO [2017-09-19 17:18:16.002] [JobScheduler thread-1] (JobScheduler.java:908) - Finished computing after job completion operations for execution 1492 (type=NORMAL) [0 sec]
WARN [2017-09-19 17:18:16.010] [JobScheduler thread-1] (JobScheduler.java:759) - Job DapJobExecution{id=1492, type=NORMAL, status=ERROR} completed with status ERROR.

Any ideas?

Michael Ahn

5 comments

  • Joel Stewart

    It appears that the root cause of the exception is this one:

    Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapred.MRVersion

    Based on that, I'm inclined to take a close look at the classpath of the Datameer environment. This particular class is included in the hadoop-mapred-*.jar file. Can you review the classpath defined on the Datameer Administration -> Hadoop Configuration page to check whether the underlying jar file exists within that path?
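
    One quick way to do that check is to scan every jar on the configured classpath for the missing class. The helper below is a hypothetical illustration (not part of Datameer), and the parcel path in the example is illustrative; adjust it to your installation:

    ```python
    import glob
    import zipfile

    def jars_containing(class_name, jar_paths):
        """Return the jars from jar_paths that contain the given class."""
        entry = class_name.replace(".", "/") + ".class"
        hits = []
        for jar in jar_paths:
            try:
                with zipfile.ZipFile(jar) as zf:
                    if entry in zf.namelist():
                        hits.append(jar)
            except (OSError, zipfile.BadZipFile):
                continue  # skip unreadable or non-jar files
        return hits

    # The parcel path below is illustrative; point it at your Hadoop libs.
    jars = glob.glob("/opt/cloudera/parcels/CDH/lib/hadoop/**/*.jar", recursive=True)
    print(jars_containing("org.apache.hadoop.mapred.MRVersion", jars))
    ```

    An empty result from the directories on the YARN classpath would confirm that no jar providing the class is visible to the job.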

     

  • Michael Ahn

    I'm using Datameer's auto-configuration feature for the Hadoop cluster on CDH 5.11. The YARN classpath is set to

    $HADOOP_CLIENT_CONF_DIR,$HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/*,$HADOOP_COMMON_HOME/lib/*,$HADOOP_HDFS_HOME/*,$HADOOP_HDFS_HOME/lib/*,$HADOOP_YARN_HOME/*,$HADOOP_YARN_HOME/lib/*

    I found the missing class in $HADOOP_COMMON_HOME/lib/hadoop/client-0.20/hadoop-core.jar.

    I changed the cluster configuration to Manual and added $HADOOP_COMMON_HOME/lib/hadoop/client-0.20/* to the YARN classpath.

    Now it works. This looks like a bug in the cluster auto-configuration for CDH.

     

  • Joel Stewart

    Thanks for the update, Michael; I'm glad to hear that adjusting the classpath resolved the issue. With respect to the auto-configuration, did the YARN classpath that Datameer detected match the output of the 'yarn classpath' command? That is the design intention of the auto-configuration.

    In this case, it seems a class that other CDH workflows never needed turned out to be required by these Datameer jobs.

  • Michael Ahn

    The result of the 'yarn classpath' command is:

    [datameer@dn183 ~]$ yarn classpath
    /etc/hadoop/conf:/etc/hadoop/conf:/etc/hadoop/conf:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/libexec/../../hadoop/lib/*:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/libexec/../../hadoop/.//*:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/libexec/../../hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/libexec/../../hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/libexec/../../hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/libexec/../../hadoop-yarn/.//*:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/lib/*:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/.//*:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/libexec/../../hadoop-yarn/.//*:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/libexec/../../hadoop-yarn/lib/*

    If I use this as the YARN classpath in Datameer with manual cluster configuration, the Hive connection works. The YARN classpath used in auto-configuration (from job-conf.xml) is:

    $HADOOP_CLIENT_CONF_DIR,$HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/*,$HADOOP_COMMON_HOME/lib/*,$HADOOP_HDFS_HOME/*,$HADOOP_HDFS_HOME/lib/*,$HADOOP_YARN_HOME/*,$HADOOP_YARN_HOME/lib/*

    This looks like the default value and is not specific to CDH.
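
    For reference, the $VAR entries in that default template are resolved against the Hadoop environment on each node, which is why it only works when those variables point at directories that actually hold the jars. A rough sketch of the substitution (the variable values below are illustrative, not what a real CDH node sets):

    ```python
    def expand_classpath(template, env):
        """Substitute $VAR references in a comma-separated YARN classpath
        template, returning the resulting list of path patterns."""
        entries = []
        for entry in template.split(","):
            # Replace longer variable names first so e.g. $HADOOP_COMMON_HOME
            # is not clobbered by a shorter prefix.
            for name in sorted(env, key=len, reverse=True):
                entry = entry.replace("$" + name, env[name])
            entries.append(entry)
        return entries

    # Illustrative values only; a real node sets these via hadoop-env/parcels.
    env = {
        "HADOOP_CONF_DIR": "/etc/hadoop/conf",
        "HADOOP_COMMON_HOME": "/opt/cloudera/parcels/CDH/lib/hadoop",
    }
    template = "$HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/*,$HADOOP_COMMON_HOME/lib/*"
    print(expand_classpath(template, env))
    ```

    Note that the expanded entries never include a client-0.20 or hadoop-mapreduce directory, which matches the symptom above.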

    I found out that you can also set the YARN classpath in auto-configuration mode by adding

    yarn.application.classpath=/etc/hadoop/conf,/opt/cloudera/parcels/CDH/lib/hadoop/*,/opt/cloudera/parcels/CDH/lib/hadoop/lib/*,/opt/cloudera/parcels/CDH/lib/hadoop-hdfs/*,/opt/cloudera/parcels/CDH/lib/hadoop-hdfs/lib/*,/opt/cloudera/parcels/CDH/lib/hadoop-yarn/*,/opt/cloudera/parcels/CDH/lib/hadoop-yarn/lib/*,/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/*,/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/lib/*

    in the "Hadoop Distribution Specific Properties" box.

  • Joel Stewart

    Thanks, Michael. I'll share this technical feedback with our Product Team so that we can improve the auto-configuration in a future release.
