Replies: 1 comment
- That works. A direct bulk load is fine, but with 8 billion rows you should first estimate the disk capacity you will need.
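A back-of-the-envelope disk estimate for the reply above. The average URL length, compression ratio, and replica count here are all assumptions, not figures from the thread:

```python
# Rough disk-capacity estimate for 8 billion (url, date) rows.
# AVG_URL_BYTES and COMPRESSION are assumptions; adjust to your data.
ROWS = 8_000_000_000
AVG_URL_BYTES = 80        # assumed average URL length in bytes
DATE_BYTES = 4            # a DATE column is 4 bytes
COMPRESSION = 3           # assumed columnar compression ratio
REPLICAS = 1              # single-node setup, no replication

raw_bytes = ROWS * (AVG_URL_BYTES + DATE_BYTES)
on_disk = raw_bytes / COMPRESSION * REPLICAS
print(f"raw: {raw_bytes / 1e9:.0f} GB, estimated on disk: {on_disk / 1e9:.0f} GB")
```

Even under optimistic compression this lands in the hundreds of gigabytes, so the disk-sizing warning in the reply is worth taking seriously.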
- I have 8 billion URL records, each with a timestamp field, and I only need to keep the unique URLs from the last 5 days. I want to do this with single-node Doris using a Unique-key table with merge-on-write, storing just two fields: the URL and the date. Each day I would write the data for the past five days and delete anything older than 5 days. Is Doris suitable for this, or would Redis with a Bloom filter, or pyspark, be a better choice?
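A minimal sketch of the table the question describes, assuming Apache Doris's unique-key merge-on-write model and dynamic partitioning (which can drop old daily partitions automatically instead of requiring manual deletes). Table and column names are illustrative, and note that with `UNIQUE KEY(dt, url)` deduplication happens per day, not globally across the 5-day window:

```sql
-- Illustrative sketch, not a verified production DDL.
CREATE TABLE IF NOT EXISTS url_dedup (
    dt  DATE          NOT NULL COMMENT "ingestion date, also the partition key",
    url VARCHAR(2048) NOT NULL COMMENT "URL to deduplicate"
)
UNIQUE KEY(dt, url)
PARTITION BY RANGE(dt) ()
DISTRIBUTED BY HASH(url) BUCKETS 32
PROPERTIES (
    "enable_unique_key_merge_on_write" = "true",
    "dynamic_partition.enable"    = "true",
    "dynamic_partition.time_unit" = "DAY",
    "dynamic_partition.start"     = "-5",  -- partitions older than ~5 days are dropped
    "dynamic_partition.end"       = "1",
    "dynamic_partition.prefix"    = "p",
    "replication_num"             = "1"    -- single node, no replicas
);
```

If the goal is uniqueness of URLs across the whole 5-day window rather than per day, the key would have to be the URL alone, which rules out partition-based expiry and changes the deletion strategy.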