The Linux Foundation last week announced that it will host Valkey, a fork of the Redis in-memory data store.
This fork originated at AWS, where longtime Redis maintainer Madelyn Olson initially hosted the project in her own GitHub account.
Olson told me that when the news of Redis's license change broke, many of the current Redis maintainers quickly decided it was time to move on.
“When the news broke, everyone was just like, ‘Well, we’re not going to go contribute to this new license,’ and so as soon as I talked to everyone, ‘Hey, I have this fork — we’re trying to keep the old group together,’” she said, “pretty much everyone was like, ‘Yeah, I’m immediately on board.’”
Redis’s license-change announcement came right in the middle of the European edition of the Cloud Native Computing Foundation’s KubeCon conference, which was held in Paris this year.
One area Redis (the company) is investing in is moving beyond purely in-memory storage to also use flash, with RAM serving as a large, high-performance cache.