Hadoop Audit Log Configuration
HDFS audit logging: in hadoop-env.sh, set hdfs.audit.logger inside HADOOP_NAMENODE_OPTS:

export HADOOP_NAMENODE_OPTS="... -Dhdfs.audit.logger=${HDFS_AUDIT_LOGGER:-INFO,RFAAUDIT} $HADOOP_NAMENODE_OPTS"

The audit log is written to logs/hdfs-audit.log on the NameNode host, for example:

2014-04-30 10:19:13,173 INFO FSNamesystem.audit: allowed=true ugi=cdh5 (auth:SIMPLE) ip=/10.1.251.52 cmd=create src=/a._COPYING_ dst=null perm=cdh5:supergroup:rw-r--r--

Field meanings:

ugi   <user>,<group>[,<group>]*
ip    <client ip address>
cmd   (open|create|delete|rename|mkdirs|listStatus|setReplication|setOwner|setPermission)
src   <path>
dst   (<path>|"null")
perm  (<user>:<group>:<perm mask>|"null")
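Because each audit record is a single line of key=value pairs, ordinary shell tooling can summarize it. A minimal sketch; the demo embeds the sample record from above into a temporary file, so point the grep at your real logs/hdfs-audit.log on the NameNode host instead:

```shell
# Create a demo input using the sample audit record above
# (replace /tmp/hdfs-audit-sample.log with your real logs/hdfs-audit.log).
cat <<'EOF' > /tmp/hdfs-audit-sample.log
2014-04-30 10:19:13,173 INFO FSNamesystem.audit: allowed=true ugi=cdh5 (auth:SIMPLE) ip=/10.1.251.52 cmd=create src=/a._COPYING_ dst=null perm=cdh5:supergroup:rw-r--r--
EOF

# Count audit events per command type, most frequent first,
# e.g. "1 cmd=create" for the sample input.
grep -o 'cmd=[[:alnum:]]*' /tmp/hdfs-audit-sample.log | sort | uniq -c | sort -rn
```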
YARN/MapReduce audit logging: the audit log is written to logs/mapred-audit.log on the ResourceManager host, in the following format:

2014-04-30 10:35:09,595 INFO resourcemanager.RMAuditLogger: USER=cdh5 IP=10.1.251.52 OPERATION=Submit Application Request TARGET=ClientRMService RESULT=SUCCESS APPID=application_1398825288110_0001
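Since every RMAuditLogger record carries a RESULT field, filtering for anything other than SUCCESS surfaces rejected operations. A minimal sketch; the demo embeds the sample record from above, so substitute your real logs/mapred-audit.log on the ResourceManager host:

```shell
# Create a demo input using the sample RM audit record above
# (replace /tmp/mapred-audit-sample.log with your real logs/mapred-audit.log).
cat <<'EOF' > /tmp/mapred-audit-sample.log
2014-04-30 10:35:09,595 INFO resourcemanager.RMAuditLogger: USER=cdh5 IP=10.1.251.52 OPERATION=Submit Application Request TARGET=ClientRMService RESULT=SUCCESS APPID=application_1398825288110_0001
EOF

# Show every audit record whose RESULT is not SUCCESS; grep exits non-zero
# when nothing matches, so the fallback message fires for a clean log.
grep -v 'RESULT=SUCCESS' /tmp/mapred-audit-sample.log || echo "no failed operations"
```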
Hive metastore audit (metadata queries and modifications): add an audit appender to Hive's log4j configuration:

log4j.appender.HAUDIT=org.apache.log4j.DailyRollingFileAppender
log4j.appender.HAUDIT.File=${hive.log.dir}/hive_audit.log
log4j.appender.HAUDIT.DatePattern=.yyyy-MM-dd
log4j.appender.HAUDIT.layout=org.apache.log4j.PatternLayout
log4j.appender.HAUDIT.layout.ConversionPattern=%d{ISO8601} %-5p %c{2} (%F:%M(%L)) - %m%n
log4j.logger.org.apache.hadoop.hive.metastore.HiveMetaStore.audit=INFO,HAUDIT
Log file: logs/hive_audit.log. Sample records:

2014-04-30 11:26:09,918 INFO HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(242)) - ugi=cdh5 ip=unknown-ip-addr cmd=get_database: default
2014-04-30 11:26:09,931 INFO HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(242)) - ugi=cdh5 ip=unknown-ip-addr cmd=get_tables: db=default pat=.*
2014-04-30 11:26:45,153 INFO HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(242)) - ugi=cdh5 ip=unknown-ip-addr cmd=get_table : db=default tbl=abc
2014-04-30 11:26:45,253 INFO HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(242)) - ugi=cdh5 ip=unknown-ip-addr cmd=get_table : db=default tbl=abc
2014-04-30 11:26:45,285 INFO HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(242)) - ugi=cdh5 ip=unknown-ip-addr cmd=get_table : db=default tbl=abc
2014-04-30 11:26:45,315 INFO HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(242)) - ugi=cdh5 ip=unknown-ip-addr cmd=drop_table : db=default tbl=abc
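The records above show that destructive operations (drop_*, alter_*) appear in the same cmd= field as read-only ones, so they can be pulled out together with the acting user. A minimal sketch; the demo embeds two sample records from above, so substitute your real logs/hive_audit.log:

```shell
# Create a demo input using two sample records from above
# (replace /tmp/hive-audit-sample.log with your real logs/hive_audit.log).
cat <<'EOF' > /tmp/hive-audit-sample.log
2014-04-30 11:26:09,918 INFO HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(242)) - ugi=cdh5 ip=unknown-ip-addr cmd=get_database: default
2014-04-30 11:26:45,315 INFO HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(242)) - ugi=cdh5 ip=unknown-ip-addr cmd=drop_table : db=default tbl=abc
EOF

# For each destructive metastore operation, print the user and the command,
# e.g. "ugi=cdh5 cmd=drop_table" for the sample input.
awk '/cmd=drop_|cmd=alter_/ {
  for (i = 1; i <= NF; i++) if ($i ~ /^(ugi=|cmd=)/) printf "%s ", $i
  print ""
}' /tmp/hive-audit-sample.log
```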
HBase audit:
1) Enable HBase security in hbase-site.xml:

<property>
  <name>hbase.rpc.engine</name>
  <value>org.apache.hadoop.hbase.ipc.SecureRpcEngine</value>
</property>
<property>
  <name>hbase.coprocessor.master.classes</name>
  <value>org.apache.hadoop.hbase.security.access.AccessController</value>
</property>
<property>
  <name>hbase.coprocessor.region.classes</name>
  <value>org.apache.hadoop.hbase.security.token.TokenProvider,org.apache.hadoop.hbase.security.access.AccessController</value>
</property>
<property>
  <name>hbase.superuser</name>
  <!-- specify the superuser account -->
  <value>cdh5</value>
</property>
2) In log4j.properties, enable the security audit appender. The following defaults are already present:

hbase.security.log.file=SecurityAuth.audit
hbase.security.log.maxfilesize=256MB
hbase.security.log.maxbackupindex=20
log4j.appender.RFAS=org.apache.log4j.RollingFileAppender
log4j.appender.RFAS.File=${hbase.log.dir}/${hbase.security.log.file}
log4j.appender.RFAS.MaxFileSize=${hbase.security.log.maxfilesize}
log4j.appender.RFAS.MaxBackupIndex=${hbase.security.log.maxbackupindex}
log4j.appender.RFAS.layout=org.apache.log4j.PatternLayout
log4j.appender.RFAS.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
log4j.category.SecurityLogger=${hbase.security.logger}
log4j.additivity.SecurityLogger=true

Then set the AccessController logger to TRACE so that individual access decisions are recorded:

log4j.logger.SecurityLogger.org.apache.hadoop.hbase.security.access.AccessController=TRACE
The log format is as follows:

2014-06-10 16:09:53,319 TRACE SecurityLogger.org.apache.hadoop.hbase.security.access.AccessController: Access allowed for user cdh5; reason: Table permission granted; remote address: /10.1.251.152; request: deleteTable; context: (user=cdh5, scope=yqhtt, family=, action=ADMIN)
2014-06-10 16:09:53,356 TRACE SecurityLogger.org.apache.hadoop.hbase.security.access.AccessController: Access allowed for user cdh5; reason: All users allowed; remote address: /10.1.251.152; request: getClosestRowBefore; context: (user=cdh5, scope=hbase:meta, family=info, action=READ)
2014-06-10 16:09:53,403 TRACE SecurityLogger.org.apache.hadoop.hbase.security.access.AccessController: Access allowed for user cdh5; reason: Table permission granted; remote address: /10.1.251.152; request: delete; context: (user=cdh5, scope=hbase:meta, family=info:, action=WRITE)
2014-06-10 16:09:53,444 TRACE SecurityLogger.org.apache.hadoop.hbase.security.access.AccessController: Access allowed for user cdh5; reason: Table permission granted; remote address: /10.1.251.152; request: delete; context: (user=cdh5, scope=hbase:acl, family=l:, action=WRITE)
2014-06-10 16:09:53,471 TRACE SecurityLogger.org.apache.hadoop.hbase.security.access.AccessController: Access allowed for user cdh5; reason: All users allowed; remote address: /10.1.251.152; request: getClosestRowBefore; context: (user=cdh5, scope=hbase:meta, family=info, action=READ)
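Each TRACE record states whether access was allowed or denied, so a quick tally shows the allow/deny ratio at a glance. A minimal sketch; the demo embeds one sample record from above, so substitute the real SecurityAuth.audit file under hbase.log.dir:

```shell
# Create a demo input using a sample AccessController record from above
# (replace /tmp/hbase-audit-sample.log with your real SecurityAuth.audit).
cat <<'EOF' > /tmp/hbase-audit-sample.log
2014-06-10 16:09:53,319 TRACE SecurityLogger.org.apache.hadoop.hbase.security.access.AccessController: Access allowed for user cdh5; reason: Table permission granted; remote address: /10.1.251.152; request: deleteTable; context: (user=cdh5, scope=yqhtt, family=, action=ADMIN)
EOF

# Count allowed vs. denied access decisions,
# e.g. "1 Access allowed" for the sample input.
grep -oE 'Access (allowed|denied)' /tmp/hbase-audit-sample.log | sort | uniq -c
```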