From: jkhosali@nps.edu (Jean Khosalim)
Date: Tue, 14 Feb 2012 08:24:20 -0800
Subject: [refpolicy] SELinux policy for Hadoop
In-Reply-To: <4F3A6EE5.5010305@redhat.com>
References: <001801cce698$0bd44560$237cd020$@edu> <4F32D102.3060605@tresys.com> <002601cce6a0$e2e7ce20$a8b76a60$@edu> <4F32DDA1.3050901@redhat.com> <002701cce6a4$ba1efdc0$2e5cf940$@edu> <4F34184B.5030106@redhat.com> <000e01cce761$4adb45f0$e091d1d0$@edu> <4F3441D0.508@redhat.com> <000301ccea96$289502f0$79bf08d0$@edu> <4F39843D.70202@redhat.com> <000d01ccea9e$6ce2b170$46a81450$@edu> <4F3A6EE5.5010305@redhat.com>
Message-ID: <000101cceb35$24a09700$6de1c500$@edu>
To: refpolicy@oss.tresys.com
List-Id: refpolicy.oss.tresys.com

> > Then /usr/lib/hadoop-0.20/bin/hadoop script (labeled
> > system_u:object_r:hadoop_exec_t:s0) invoke java: nohup su
> > $HADOOP_DAEMON_USER -s $JAVA -- -Dproc_$COMMAND_JAVA.....
>
> Ok what label does this run as?

The 'su' processes seem to run as 'system_u:system_r:initrc_t:s0'; the actual java processes run as 'system_u:system_r:unconfined_java_t:s0'. The following is the output of 'ps auxZ | grep java' (with a portion of each ps line replaced with '.....' because the lines are too long):

----- Begin output of 'ps auxZ | grep java' -----
system_u:system_r:initrc_t:s0 root 1107 0.0 0.2 7808 2180 ? S 10:44 0:00 su mapred -s /usr/java/jdk1.6.0_30/bin/java -- -Dproc_tasktracker ..... org.apache.hadoop.mapred.TaskTracker
system_u:system_r:initrc_t:s0 root 1109 0.0 0.2 7812 2188 ? S 10:44 0:00 su mapred -s /usr/java/jdk1.6.0_30/bin/java -- -Dproc_jobtracker ..... org.apache.hadoop.mapred.JobTracker
system_u:system_r:initrc_t:s0 root 1111 0.0 0.2 7812 2188 ? S 10:44 0:00 su hdfs -s /usr/java/jdk1.6.0_30/bin/java -- -Dproc_secondarynamenode ..... org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode
system_u:system_r:initrc_t:s0 root 1113 0.0 0.2 7812 2192 ? S 10:44 0:00 su hdfs -s /usr/java/jdk1.6.0_30/bin/java -- -Dproc_datanode ..... org.apache.hadoop.hdfs.server.datanode.DataNode
system_u:system_r:initrc_t:s0 root 1115 0.0 0.2 7812 2184 ? S 10:44 0:00 su hdfs -s /usr/java/jdk1.6.0_30/bin/java -- -Dproc_namenode ..... org.apache.hadoop.hdfs.server.namenode.NameNode
system_u:system_r:unconfined_java_t:s0 mapred 1130 1.1 4.1 1197024 42552 ? Sl 10:44 0:06 java -Dproc_jobtracker ..... org.apache.hadoop.mapred.JobTracker
system_u:system_r:unconfined_java_t:s0 hdfs 1131 1.1 6.3 1197864 64808 ? Sl 10:44 0:05 java -Dproc_namenode ..... org.apache.hadoop.hdfs.server.namenode.NameNode
system_u:system_r:unconfined_java_t:s0 hdfs 1132 1.0 6.1 1191856 62752 ? Sl 10:44 0:05 java -Dproc_secondarynamenode ..... org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode
system_u:system_r:unconfined_java_t:s0 mapred 1133 1.3 4.1 1195780 42856 ? Sl 10:44 0:07 java -Dproc_tasktracker ..... org.apache.hadoop.mapred.TaskTracker
system_u:system_r:unconfined_java_t:s0 hdfs 1134 1.1 4.1 1194756 42528 ? Sl 10:44 0:05 java -Dproc_datanode ..... org.apache.hadoop.hdfs.server.datanode.DataNode
----- End output of 'ps auxZ | grep java' -----

>
> > If I try to run: runcon -t hadoop_t su hdfs -s
> > /usr/java/jdk1.6.0_30/bin/java -- -Dproc_$COMMAND_JAVA..... I got
> > runcon: invalid context: unconfined_u:unconfined_r:hadoop_t:s0-s0:c0.c1023: Invalid argument.
>
> Try
>
> runcon system_u:system_r:hadoop_t:s0 su hdfs -s
> /usr/java/jdk1.6.0_30/bin/java --

I got the following error when I ran the above:

runcon: invalid context: system_u:system_r:hadoop_t:s0: Invalid argument

Thanks,

Jean Khosalim
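P.S. For what it's worth, runcon's "invalid context: ...: Invalid argument" usually means the loaded policy rejects the context as a whole (for example, the system_r role may not be authorized for the hadoop_t type), not that the string itself is malformed. Below is a minimal, syntax-only sketch in plain shell of what "well-formed" means here (the user:role:type[:range] field layout is the assumption; whether the policy actually accepts a given combination can only be checked on the SELinux box itself, e.g. with seinfo from setools):

```shell
#!/bin/sh
# check_ctx.sh - syntax-level check of an SELinux context string (sketch).
# A context needs at least user:role:type fields, with an optional MLS
# range.  This does NOT consult the loaded policy, so it cannot catch the
# role/type authorization failure that produces "Invalid argument" above.
check_ctx() {
    ctx=$1
    # Count the colon-separated fields in the context string.
    n=$(printf '%s' "$ctx" | awk -F: '{print NF}')
    if [ "$n" -lt 3 ]; then
        echo "malformed: $ctx"
        return 1
    fi
    echo "well-formed: $ctx"
}

check_ctx "hadoop_t"                         # prints "malformed: hadoop_t"
check_ctx "system_u:system_r:hadoop_t:s0"    # prints "well-formed: ..."
```

Since system_u:system_r:hadoop_t:s0 passes this kind of check but runcon still rejects it, the next thing to look at would be whether the policy ties system_r to hadoop_t at all (on a box with setools installed, something like `seinfo -rsystem_r -x | grep hadoop_t` should show it); a missing `role system_r types hadoop_t;` style authorization is a common cause of exactly this error.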