Redis-bigkeys

From the linux.cn wiki

Batch-deleting Redis keys

Info

172.16.200.12:7002> cluster nodes
686cfc039ab7c2c9b55bebff0b97eb9bb4060f78 172.16.20.7:[email protected] master - 0 1591844242096 9 connected 5461-10922
d73e3338dfebb963967135bd73fe10bd459f9e65 172.16.20.7:[email protected] master - 0 1591844245101 5 connected 10923-16383

518ca649f51c01498b3ab42b9e62c40f921f6327 172.16.20.12:[email protected] master - 0 1591844242000 8 connected 0-5460
4c90517a70df13b37752c9199f9ba6da15682ff4 172.16.20.12:[email protected] myself,slave 686cfc039ab7c2c9b55bebff0b97eb9bb4060f78 0 1591844244000 3 connected


242902c61a434edf2efc15909e7fa33e930351c9 172.16.20.13:[email protected] slave d73e3338dfebb963967135bd73fe10bd459f9e65 0 1591844243097 5 connected
909a54d51117d2e8bb587656db37c8def525e472 172.16.20.13:[email protected] slave 518ca649f51c01498b3ab42b9e62c40f921f6327 0 1591844244098 8 connected


On this machine (172.16.20.12), 7003 is a master and 7002 is a slave; 7002's master turns out to be mq3 (172.16.20.7:7005).

redis_cluster]# du  -sh  data/*
2.7G	data/appendonly-7002.aof
2.4G	data/appendonly-7003.aof
1.5G	data/dump_7002.rdb
144K	data/dump_7003.rdb


There are several ways to find out whether a Redis instance holds big keys:

   Run BGSAVE on the instance, then analyze the dumped RDB file to locate the big keys.
   Use DEBUG OBJECT <key> (not generally recommended): it shows the serialized size of the key in memory, and combined with SCAN it gives the size of every key in the instance.
   Use redis-cli's built-in --bigkeys option, which finds the biggest key of each of the five data types (string, hash, list, set, zset).
   Finally, the patched redis-cli-new (see below).
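The SCAN + DEBUG OBJECT approach above can be sketched in Python. This is a sketch, not the article's code: it assumes the redis-py client (whose `scan_iter` and `debug_object` calls map to SCAN and DEBUG OBJECT); the connection object is passed in by the caller.

```python
# Sketch: walk the keyspace with SCAN and size each key via DEBUG OBJECT.
# Assumes a redis-py connection object `r`; serializedlength is the
# serialized size, not the in-memory footprint.
import heapq

def top_n(sizes, n):
    """Keep the n largest (size, key) pairs."""
    return heapq.nlargest(n, sizes)

def scan_big_keys(r, n=10, batch=100):
    """Return the n largest keys by DEBUG OBJECT serializedlength."""
    sizes = ((r.debug_object(k)["serializedlength"], k)
             for k in r.scan_iter(count=batch))   # SCAN in batches
    return top_n(sizes, n)
```

Note that this issues one DEBUG OBJECT per key, so on a 12-million-key instance it is far slower (and more intrusive) than the sampling --bigkeys does.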

redis-cli --bigkeys

debug object <key>

redis-cli's --bigkeys option scans the entire keyspace (sampling via SCAN when the dataset is large), finds the largest keys of each data type, and prints summary statistics:
redis-cli --bigkeys -i 0.1 -h 127.0.0.1


Step 1: redis-cli --bigkeys

Note: the scan drives CPU usage up on the instance while it runs.

 redis-cli  -c -h 172.16.20.12 -p 7003 -a pass --bigkeys     
Warning: Using a password with '-a' or '-u' option on the command line interface may not be safe.

# Scanning the entire keyspace to find biggest keys as well as
# average sizes per key type.  You can use -i 0.1 to sleep 0.1 sec
# per 100 SCAN commands (not usually needed).

[00.00%] Biggest string found so far 'sns_cache:userFollowLog:1296748078836308812708058410934272' with 1 bytes
[00.00%] Biggest string found so far 'uv:post:12987950080525078' with 59 bytes
[00.00%] Biggest hash   found so far 'posts:comment_user:12991077241267048:85412991077241267048' with 3 fields
[00.00%] Biggest string found so far 'sns_cache:five_latest_replies_v5:comment_id:12966682424339580' with 534 bytes
[00.00%] Biggest hash   found so far 'userFollowLog:12926532386375374' with 155 fields
[00.01%] Biggest string found so far 'uv:post:12995339861518175' with 933 bytes
[00.01%] Biggest hash   found so far 'postsPraiseLog:12947079027599160' with 12431 fields
[00.01%] Biggest string found so far 'sns_cache:five_latest_replies_v5:comment_id:12873404201224716' with 1575 bytes
[00.02%] Biggest set    found so far 'posts_comment_reply:user_post:12948355526503870' with 421 members
[00.03%] Biggest string found so far 'sns_cache:five_latest_replies_v5:comment_id:12942125413091883' with 2136 bytes
[00.04%] Biggest hash   found so far 'postsPraiseLog:12923869855985512' with 25959 fields
[00.12%] Biggest hash   found so far 'postsPraiseLog:12783531839545344' with 74653 fields
[00.13%] Biggest string found so far 'sns_cache:five_latest_replies_v5:comment_id:12952979705119705' with 2605 bytes
[00.16%] Biggest string found so far 'sns_cache:five_latest_replies_v5:comment_id:12978438783438619' with 2881 bytes
[00.18%] Biggest string found so far 'sns_cache:five_latest_replies_v5:comment_id:13011992277674709' with 3017 bytes
[00.24%] Biggest set    found so far 'posts_comment_reply:user_post:12784388711768064' with 731 members
[00.24%] Biggest list   found so far 'robot:operation_list:like:oTIpr4SkvLsZ' with 100 items
[00.49%] Biggest string found so far 'sns_cache:five_latest_replies_v5:comment_id:12923545866027011' with 3812 bytes
[00.61%] Biggest string found so far 'uv:post:12984703254167595' with 12304 bytes
[02.06%] Biggest list   found so far 'robot:operation_list:follow:ogSLwcZEpvRz' with 102 items
[02.99%] Biggest list   found so far 'robot:operation_list:follow:L0rtLUbHQEFC' with 209 items
[04.07%] Biggest set    found so far 'posts_comment_reply:user_post:12796982571974656' with 826 members
[04.21%] Biggest hash   found so far 'postsPraiseLog:12907301092746796' with 85186 fields
[07.38%] Biggest hash   found so far 'postsPraiseLog:12935131931853999' with 103381 fields
[07.97%] Biggest hash   found so far 'postsPraiseLog:12817607428722673' with 107824 fields
[08.18%] Sampled 1000000 keys so far
[10.78%] Biggest set    found so far 'sns_cache:5dfad08b5f9b1413912353:standard_ref' with 1241 members
[16.37%] Sampled 2000000 keys so far
[24.55%] Sampled 3000000 keys so far
[25.46%] Biggest list   found so far 'robot:operation_list:like:n9gok0IcHqgb' with 298 items
[27.36%] Biggest hash   found so far 'postsPraiseLog:12786292153868288' with 131584 fields
[27.71%] Biggest string found so far 'area_code:list' with 15464 bytes
[30.35%] Biggest string found so far 'groupMatchRankCache' with 16320 bytes
[32.26%] Biggest hash   found so far 'postsPraiseLog:12896706894004677' with 140077 fields
[32.73%] Sampled 4000000 keys so far
[36.49%] Biggest hash   found so far 'postsPraiseLog:12786516420624384' with 193502 fields
[39.84%] Biggest string found so far 'stimulateUserStatisticsListKey' with 24063 bytes
[40.92%] Sampled 5000000 keys so far
[43.44%] Biggest list   found so far 'robot:operation_list:follow:AsNltfXCtZPR' with 300 items
[49.10%] Sampled 6000000 keys so far
[49.54%] Biggest set    found so far 'sns_cache:5dfad630a5251220444421:standard_ref' with 1255 members
[57.29%] Sampled 7000000 keys so far
[65.47%] Sampled 8000000 keys so far
[73.07%] Biggest zset   found so far 'groupAuction:socket' with 12 members
[73.65%] Sampled 9000000 keys so far
[81.84%] Sampled 10000000 keys so far
[85.05%] Biggest list   found so far 'buyback:buybackShareQueue:20200611' with 600 items
[90.02%] Sampled 11000000 keys so far
[98.20%] Sampled 12000000 keys so far

-------- summary -------

Sampled 12040461 keys in the keyspace!
Total key length in bytes is 743432861 (avg len 61.74)

Biggest   list found 'buyback:buybackShareQueue:20200611' has 600 items
Biggest   hash found 'postsPraiseLog:12784108337438720' has 219951 fields
Biggest string found 'stimulateUserStatisticsListKey' has 24063 bytes
Biggest    set found 'sns_cache:5dfad630a5251220444421:standard_ref' has 1255 members
Biggest   zset found 'groupAuction:socket' has 12 members

134 lists with 12778 items (00.00% of keys, avg size 95.36)
137285 hashs with 49491661 fields (01.14% of keys, avg size 360.50)
11899007 strings with 45722370 bytes (98.83% of keys, avg size 3.84)
0 streams with 0 entries (00.00% of keys, avg size 0.00)
4034 sets with 494122 members (00.03% of keys, avg size 122.49)
1 zsets with 12 members (00.00% of keys, avg size 12.00)


The output has two parts: everything above the summary just shows scan progress; the summary section lists the biggest key of each data type.
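The summary lines are easy to post-process if you want the results in a script; a minimal parser (the line layout is taken from the summary output above):

```python
import re

# Matches summary lines such as:
#   Biggest   hash found 'postsPraiseLog:12784108337438720' has 219951 fields
LINE = re.compile(r"Biggest\s+(\w+) found '([^']+)' has (\d+) (\w+)")

def parse_summary(text):
    """Map each data type to (key, count, unit) from --bigkeys summary output."""
    out = {}
    for m in LINE.finditer(text):
        typ, key, count, unit = m.groups()
        out[typ] = (key, int(count), unit)
    return out
```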

Step 2: DEBUG OBJECT

 debug object postsPraiseLog:12784108337438720
Value at:0x7fbeb046e110 refcount:1 encoding:hashtable serializedlength:4395140 lru:14786916 lru_seconds_idle:25

172.16.20.12:7003> debug object stimulateUserStatisticsListKey
Value at:0x7fbcde2332b0 refcount:1 encoding:raw serializedlength:10675 lru:14786772 lru_seconds_idle:319
172.16.20.12:7003> strlen stimulateUserStatisticsListKey
(integer) 24063
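Note that serializedlength is the serialized (RDB, possibly compressed) size, not the live memory footprint — which is why strlen above returns 24063 bytes while serializedlength shows only 10675. As a quick worked check, the per-field serialized size of the biggest hash, using the numbers above:

```python
# Rough per-field serialized size of the biggest hash, from the output above.
serializedlength = 4395140   # bytes, from DEBUG OBJECT
fields = 219951              # field count, from the --bigkeys summary

per_field = serializedlength / fields
print(round(per_field, 1))   # roughly 20 bytes of serialized data per field
```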



Third-party tools

redis-cli-new (works)

# Example: report the top 3 biggest keys of each type
redis-cli-new -p 7000 --bigkeys --bigkey-numb  3  


With redis-cli --bigkeys we can easily find the biggest keys in an instance, but it reports only one key per data type. The author therefore modified redis-cli's findBigKeys routine to collect multiple keys, letting the user specify how many big keys to report.

A preview of the modified tool:

VITOXIE-MB1:src xiean$ ./redis-cli-new -p 2837 --bigkeys --bigkey-numb  3

Biggest string Key Top   1  found 'xxxG_NEWMATCH_VOD_DATA_7f7a2a2fb5f780a13fecd9f1e51bdf8a' has 53170 bytes
Biggest string Key Top   2  found 'xxxG_NEWMATCH_VOD_DATA_a9758560d1874493c637dec0753909da' has 53159 bytes
Biggest string Key Top   3  found 'xxxG_NEWMATCH_VOD_DATA_d0971977b0ce028141e53b020b93d822' has 53156 bytes
Biggest   list Key Top   1  found 'UserPostInfo122_632789064' has 11028 items
Biggest   list Key Top   2  found 'xxxG_FriendCallBack_PushList_23' has 1973 items
Biggest   list Key Top   3  found 'xxxG_FriendCallBack_PushList_20' has 1824 items

PS: the modified source is on GitHub, along with some day-to-day DBA tools: https://github.com/xiepaup/OPS-Tools

https://github.com/xiepaup/dbatools

godis-cli-bigkey

go run godis-cli-bigkey.go

Make sure $GOROOT and $GOPATH are set:

vi /etc/profile
export PATH=/data/apps/go/bin/:$PATH
export GOPATH=/root/go/  # (optional) if you run as root
export GOROOT=/data/apps/go/
#export GOARCH=amd64
#export GOOS=linux
export GOTOOLS=$GOROOT/pkg/tool
#export PATH=$PATH:$GOROOT/bin:$GOPATH/bin

#export PATH=$PATH:/usr/local/gobin

source /etc/profile


Error

# go run godis-cli-bigkey.go
godis-cli-bigkey.go:8:2: cannot find package "github.com/erpeng/godis-cli-bigkey/pool" in any of:
	/data/apps/go/src/github.com/erpeng/godis-cli-bigkey/pool (from $GOROOT)
	/root/go/src/github.com/erpeng/godis-cli-bigkey/pool (from $GOPATH)
godis-cli-bigkey.go:9:2: cannot find package "github.com/erpeng/godis-cli-bigkey/rdb" in any of:
	/data/apps/go/src/github.com/erpeng/godis-cli-bigkey/rdb (from $GOROOT)
	/root/go/src/github.com/erpeng/godis-cli-bigkey/rdb (from $GOPATH)


go mod init godis-cli-bigkey   # this fixes it



Get "https://proxy.golang.org/github.com/erpeng/godis-cli-bigkey/pool/@v/list": dial tcp 34.64.4.113:443: i/o timeout
godis-cli-bigkey.go:9:2: module github.com/erpeng/godis-cli-bigkey/rdb: Get "https://proxy.golang.org/github.com/erpeng/godis-cli-bigkey/rdb/@v/list": dial tcp 34.64.4.113:443: i/o timeout
Switching to a domestic Go module proxy fixes the proxy.golang.org timeouts:
[[email protected] godis-cli-bigkey]# go env -w GO111MODULE=on
[[email protected] godis-cli-bigkey]# go env -w GOPROXY=https://mirrors.aliyun.com/goproxy/,direct



Usage 


# the tool reads rdb.rdb in the current directory, so rename the dump
mv rdb.rdb rdb.rdbbak
mv dump_7002.rdb rdb.rdb


Result

key:k1,valueSize:9,valueType:0,expireTime:1549533396795,lfu:0,lru:0
key:key,valueSize:9,valueType:0,expireTime:0,lfu:0,lru:0
key:ss1,valueSize:14,valueType:2,expireTime:0,lfu:0,lru:0
key:si1,valueSize:23,valueType:11,expireTime:0,lfu:0,lru:0
key:l1,valueSize:28,valueType:14,expireTime:1549537004535,lfu:0,lru:0
key:h1,valueSize:33,valueType:13,expireTime:0,lfu:0,lru:0
key:z1,valueSize:67,valueType:12,expireTime:0,lfu:0,lru:0
key:testzset,valueSize:1303,valueType:5,expireTime:0,lfu:0,lru:0
key:h3,valueSize:8845,valueType:13,expireTime:0,lfu:0,lru:0
key:h2,valueSize:11680,valueType:4,expireTime:0,lfu:0,lru:0
key:h4,valueSize:11703,valueType:4,expireTime:0,lfu:0,lru:0


go run godis-cli-bigkey.go -topn 40
Rdb Version:0009
key:k1,valueSize:9,valueType:0,expireTime:1549533396795,lfu:0,lru:0
key:key,valueSize:9,valueType:0,expireTime:0,lfu:0,lru:0
key:ss1,valueSize:14,valueType:2,expireTime:0,lfu:0,lru:0
key:si1,valueSize:23,valueType:11,expireTime:0,lfu:0,lru:0
key:l1,valueSize:28,valueType:14,expireTime:1549537004535,lfu:0,lru:0
key:h1,valueSize:33,valueType:13,expireTime:0,lfu:0,lru:0
key:z1,valueSize:67,valueType:12,expireTime:0,lfu:0,lru:0
key:testzset,valueSize:1303,valueType:5,expireTime:0,lfu:0,lru:0
key:h3,valueSize:8845,valueType:13,expireTime:0,lfu:0,lru:0
key:h2,valueSize:11680,valueType:4,expireTime:0,lfu:0,lru:0
key:h4,valueSize:11703,valueType:4,expireTime:0,lfu:0,lru:0

https://github.com/erpeng/godis-cli-bigkey

redis_rdb_tools

mkdir   ~/.pip/
 vi ~/.pip/pip.conf

[global]
index-url = http://pypi.douban.com/simple
[install]
trusted-host=pypi.douban.com

 pip install rdbtools

pip install python-lzf


Generate a memory report

rdb -c memory  rdb.rdb   --bytes 128 -f dump_memory_mq2_7003.csv


Sort by value size

awk -F',' '{print $4,$2,$3,$1}' dump_memory.csv | sort  > dump_memory_csv.sort


awk -F',' '{print $4,$2,$3,$1}' dump_memory_mq2_7003.csv | sort  > dump_memory_mq2_7003_csv.sort

The CSV columns are: database,type,key,size_in_bytes,encoding,num_elements,len_largest_element,expiry
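The same sort can be done in Python (column layout taken from the CSV header above). Unlike the plain `sort` in the awk pipeline, this sorts numerically and largest-first, which is usually what you want for big-key hunting:

```python
import csv

def sort_rows(rows):
    """Sort rdbtools memory-report rows by size_in_bytes (column 3), largest first."""
    return sorted(rows, key=lambda r: int(r[3]), reverse=True)

def sort_csv(path_in, path_out):
    # Equivalent to the awk | sort pipeline above, but numeric and descending.
    with open(path_in, newline="") as f:
        header, *rows = list(csv.reader(f))
    with open(path_out, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(header)
        w.writerows(sort_rows(rows))
```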

Redis tools: redis_rdb_tools


Using rdbtools to parse Redis RDB files


https://blog.csdn.net/u010522235/article/details/89241765

https://www.jianshu.com/p/c885af575f97

https://www.cnblogs.com/yqzc/p/12425533.html

The rdb_bigkeys tool

 git clone https://github.com/weiyanwei412/rdb_bigkeys.git
 cd rdb_bigkeys

 go mod init  rdb_bigkeys
 go get 
go build

The build produces an executable named rdb_bigkeys.
Usage: ./rdb_bigkeys --bytes 1024 --file bigkeys.csv --sep 0 --sorted --threads 4 /home/redis/dump.rdb
Replace /home/redis/dump.rdb with the actual path to your RDB file.

see also

https://redis.io/topics/rediscli

Several ways to analyze Redis key sizes

[Redis source analysis] How to find big keys in Redis

How to find and optimize big keys in Redis?

Oversized Redis values: the big-value problem

Solving the Redis big-key problem in one article