Hadoop cannot find the generated jar when running a Sqoop job
Sqoop: 1.4.7
Hadoop: 3.4.1
Data flow: Oracle -> HDFS
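For context, a Sqoop invocation roughly like the one below reproduces the log that follows. The JDBC URL, host, service name, username, and password are placeholders; only the table name, target directory, and single mapper are taken from the log itself:

# Placeholder connection details; table and target directory come from the log below.
# Note the warning in the log: -P (prompt for password) is safer than --password on the CLI.
sqoop import \
  --connect 'jdbc:oracle:thin:@//oracle-host:1521/ORCL' \
  --username HYSH_CX \
  --password '******' \
  --table HYSH_CX.NCME_CREDIT_TYPE \
  --target-dir /user/hdfs/NCME_CREDIT_TYPE \
  --delete-target-dir \
  -m 1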
2025-04-15 16:57:00,850 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
2025-04-15 16:57:00,901 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
2025-04-15 16:57:00,965 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
2025-04-15 16:57:00,975 INFO manager.SqlManager: Using default fetchSize of 1000
2025-04-15 16:57:00,975 INFO tool.CodeGenTool: Beginning code generation
2025-04-15 16:57:01,920 INFO manager.OracleManager: Time zone has been set to GMT
2025-04-15 16:57:01,953 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM HYSH_CX.NCME_CREDIT_TYPE t WHERE 1=0
2025-04-15 16:57:01,984 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/datasophon/hdfs
Note: /tmp/sqoop-hdfs/compile/74c258101e7c0e44855b37b09735db0e/HYSH_CX_NCME_CREDIT_TYPE.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
2025-04-15 16:57:03,093 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hdfs/compile/74c258101e7c0e44855b37b09735db0e/HYSH_CX.NCME_CREDIT_TYPE.jar
2025-04-15 16:57:03,791 INFO tool.ImportTool: Destination directory /user/hdfs/NCME_CREDIT_TYPE deleted.
2025-04-15 16:57:03,823 INFO manager.OracleManager: Time zone has been set to GMT
2025-04-15 16:57:03,831 INFO mapreduce.ImportJobBase: Beginning import of HYSH_CX.NCME_CREDIT_TYPE
2025-04-15 16:57:03,832 INFO Configuration.deprecation: mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
2025-04-15 16:57:03,836 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
2025-04-15 16:57:03,843 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
2025-04-15 16:57:03,911 INFO impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2025-04-15 16:57:03,921 INFO impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2025-04-15 16:57:03,921 INFO impl.MetricsSystemImpl: JobTracker metrics system started
2025-04-15 16:57:04,087 INFO db.DBInputFormat: Using read commited transaction isolation
2025-04-15 16:57:04,096 INFO mapreduce.JobSubmitter: number of splits:1
2025-04-15 16:57:04,188 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_local2073730221_0001
2025-04-15 16:57:04,188 INFO mapreduce.JobSubmitter: Executing with tokens: []
2025-04-15 16:57:04,325 INFO mapred.LocalDistributedCacheManager: Creating symlink: /data/tmp/hadoop/mapred/local/job_local2073730221_0001_612a3a92-6e9b-4750-8da1-17ffe2e06983/libjars <- /opt/datasophon/hadoop-3.4.1/libjars/*
2025-04-15 16:57:04,327 WARN fs.FileUtil: Command 'ln -s /data/tmp/hadoop/mapred/local/job_local2073730221_0001_612a3a92-6e9b-4750-8da1-17ffe2e06983/libjars /opt/datasophon/hadoop-3.4.1/libjars/*' failed 1 with: ln: failed to create symbolic link ‘/opt/datasophon/hadoop-3.4.1/libjars/*’: No such file or directory
2025-04-15 16:57:04,327 WARN mapred.LocalDistributedCacheManager: Failed to create symlink: /data/tmp/hadoop/mapred/local/job_local2073730221_0001_612a3a92-6e9b-4750-8da1-17ffe2e06983/libjars <- /opt/datasophon/hadoop-3.4.1/libjars/*
2025-04-15 16:57:04,327 INFO mapred.LocalDistributedCacheManager: Localized file:/tmp/hadoop/mapred/staging/hdfs2073730221/.staging/job_local2073730221_0001/libjars as file:/data/tmp/hadoop/mapred/local/job_local2073730221_0001_612a3a92-6e9b-4750-8da1-17ffe2e06983/libjars
2025-04-15 16:57:04,366 INFO mapreduce.Job: The url to track the job: http://localhost:8080/
2025-04-15 16:57:04,366 INFO mapreduce.Job: Running job: job_local2073730221_0001
2025-04-15 16:57:04,367 INFO mapred.LocalJobRunner: OutputCommitter set in config null
2025-04-15 16:57:04,371 INFO output.PathOutputCommitterFactory: No output committer factory defined, defaulting to FileOutputCommitterFactory
2025-04-15 16:57:04,372 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 2
2025-04-15 16:57:04,372 INFO output.FileOutputCommitter: FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
2025-04-15 16:57:04,373 INFO mapred.LocalJobRunner: OutputCommitter is org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
2025-04-15 16:57:04,406 INFO mapred.LocalJobRunner: Waiting for map tasks
2025-04-15 16:57:04,406 INFO mapred.LocalJobRunner: Starting task: attempt_local2073730221_0001_m_000000_0
2025-04-15 16:57:04,422 INFO output.PathOutputCommitterFactory: No output committer factory defined, defaulting to FileOutputCommitterFactory
2025-04-15 16:57:04,422 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 2
2025-04-15 16:57:04,422 INFO output.FileOutputCommitter: FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
2025-04-15 16:57:04,433 INFO mapred.Task: Using ResourceCalculatorProcessTree : [ ]
2025-04-15 16:57:04,517 INFO db.DBInputFormat: Using read commited transaction isolation
2025-04-15 16:57:04,519 INFO mapred.MapTask: Processing split: 1=1 AND 1=1
2025-04-15 16:57:04,523 INFO mapred.LocalJobRunner: map task executor complete.
2025-04-15 16:57:04,528 WARN mapred.LocalJobRunner: job_local2073730221_0001
java.lang.Exception: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class HYSH_CX_NCME_CREDIT_TYPE not found
	at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:492)
	at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:552)
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class HYSH_CX_NCME_CREDIT_TYPE not found
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2737)
	at org.apache.sqoop.mapreduce.db.DBConfiguration.getInputClass(DBConfiguration.java:403)
	at org.apache.sqoop.mapreduce.db.OracleDataDrivenDBInputFormat.createDBRecordReader(OracleDataDrivenDBInputFormat.java:66)
	at org.apache.sqoop.mapreduce.db.DBInputFormat.createRecordReader(DBInputFormat.java:266)
	at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.<init>(MapTask.java:528)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:771)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:348)
	at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:271)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.ClassNotFoundException: Class HYSH_CX_NCME_CREDIT_TYPE not found
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2641)
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2735)
	... 12 more
2025-04-15 16:57:05,369 INFO mapreduce.Job: Job job_local2073730221_0001 running in uber mode : false
2025-04-15 16:57:05,370 INFO mapreduce.Job: map 0% reduce 0%
2025-04-15 16:57:05,371 INFO mapreduce.Job: Job job_local2073730221_0001 failed with state FAILED due to: NA
2025-04-15 16:57:05,376 INFO mapreduce.Job: Counters: 0
2025-04-15 16:57:05,379 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
2025-04-15 16:57:05,380 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 1.5304 seconds (0 bytes/sec)
2025-04-15 16:57:05,380 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
2025-04-15 16:57:05,380 INFO mapreduce.ImportJobBase: Retrieved 0 records.
2025-04-15 16:57:05,380 ERROR tool.ImportTool: Import failed: Import job failed!
How to fix
The job ID (job_local2073730221_0001) and the LocalJobRunner entries show the import ran in local mode, where the libjars localization failed (see the symlink warning above), so the map task cannot load the generated class HYSH_CX_NCME_CREDIT_TYPE. Switch MapReduce onto YARN by editing mapred-site.xml as follows, then restart:
<property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
</property>
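For reference, a minimal mapred-site.xml that pushes MapReduce jobs onto YARN in Hadoop 3.x usually looks like the sketch below. The three HADOOP_MAPRED_HOME entries are the standard Hadoop 3.x additions for YARN containers; the path is only an assumption based on the install directory visible in the log, so point it at your actual Hadoop home:

<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
  <!-- The path below is an assumption (from the log's /opt/datasophon/hadoop-3.4.1); adjust to your install -->
  <property>
    <name>yarn.app.mapreduce.am.env</name>
    <value>HADOOP_MAPRED_HOME=/opt/datasophon/hadoop-3.4.1</value>
  </property>
  <property>
    <name>mapreduce.map.env</name>
    <value>HADOOP_MAPRED_HOME=/opt/datasophon/hadoop-3.4.1</value>
  </property>
  <property>
    <name>mapreduce.reduce.env</name>
    <value>HADOOP_MAPRED_HOME=/opt/datasophon/hadoop-3.4.1</value>
  </property>
</configuration>

After distributing the change, restarting the ResourceManager and NodeManagers, and rerunning the import, the job ID should change from job_local... to a YARN application ID, which confirms the job is no longer going through LocalJobRunner.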
If that doesn't work, see here.