mirror of
https://github.com/dani-garcia/vaultwarden.git
synced 2025-12-10 17:23:04 +03:00
Vault data disappears when upgrading 1.23.0 -> 1.25.0 #947
Originally created by @vitorbaptista on GitHub.
Subject of the issue
I have a Vaultwarden deployment using docker-compose, currently on 1.23.0. I tried upgrading to 1.27.0, but then my vault was empty (I was able to log in, though). I tried all versions in between, and the only one that worked was 1.24.0.
Deployment environment
Install method: Docker-compose
Clients used: Web Vault
Other relevant details:
Steps to reproduce
I'm not sure if it's something specific to my installation, but I guess you could reproduce it by:
Expected behaviour
All passwords in the vault would be there.
Actual behaviour
The vault is empty
Troubleshooting data
These are the logs from Vaultwarden 1.27.0, but the error is the same in 1.25.0 and 1.26.0.
@BlackDex commented on GitHub:
You can use the admin interface to see the number of items.
I would really like to know the amount so that I can try to replicate.
@BlackDex commented on GitHub:
That is a lot. Probably not all your ciphers, I think.
The admin interface is vw.domain.tld/admin.
@BlackDex commented on GitHub:
May I ask how many vault items you have? (You can see this in the admin environment; please provide both your personal items and the items of all orgs you are a member of.)
It looks like you have so many items that the query gets overloaded.
Also, could you try the Alpine based images to see if that does work?
@vitorbaptista commented on GitHub:
@BlackDex I couldn't get to the admin interface, but I queried the DB directly:
Much more than I expected. Does that work for you? Or is there another query I can run on the DB that would help?
@vitorbaptista commented on GitHub:
@BlackDex thanks for the quick reply
I couldn't find the number of vault items, but it's in the thousands (I guess between 2,000 and 5,000). Interestingly, I tried logging in with a different user that had far fewer vault items, and it worked fine.
I tried the Alpine image 1.27.0-alpine and I see the same error in the logs, but it shows the vault names. Maybe the Alpine version supports more vault items?
@BlackDex commented on GitHub:
@stefan0xC if I'm correct, from the top of my head, that optimization I built came after a newer version of the sqlite library. Since we always use a vendored/built-in version, it shouldn't be an issue. Unless someone removes that vendored option, of course.
While it's nice and works well for most, it breaks for at least one. And while I do think that is a lot of ciphers, I still think I should take a look at it.
@BlackDex commented on GitHub:
Well, it looks like the maximum number of variables SQLite supports by default is 32766. That is less than the number of ciphers you reported.
I'll have to look into this and see if we can solve this in a decent way without slowing everything down again.
I also wonder how a key rotation will perform, because that will probably take a long time, and will also cause a lot of queries.
I also wonder if this breaks on MySQL or PostgreSQL.
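To make the limit concrete: a parameterized `IN (...)` clause binds one SQL variable per element, so a vault with more ciphers than SQLITE_MAX_VARIABLE_NUMBER (32766 by default since SQLite 3.32.0, 999 before that) cannot be fetched with a single such query. A minimal Python/sqlite3 sketch of the mechanism (the `ciphers` table here is an illustrative stand-in, not Vaultwarden's actual schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ciphers (id INTEGER PRIMARY KEY)")

cipher_ids = list(range(33000))  # more ids than the default limit of 32766
placeholders = ",".join("?" * len(cipher_ids))
sql = f"SELECT id FROM ciphers WHERE id IN ({placeholders})"

# Each element of the IN list consumes one bound variable, so this single
# statement needs 33000 variables -- over the default limit.
print(sql.count("?"))  # 33000

try:
    conn.execute(sql, cipher_ids)
    print("query accepted (this SQLite build was compiled with a raised limit)")
except sqlite3.OperationalError as e:
    print("query rejected:", e)  # typically "too many SQL variables"
```

Whether the query is accepted depends on the compile-time limit of the linked SQLite, which is exactly why the vendored build matters here.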
@BlackDex commented on GitHub:
That is an issue, because that query is used in the newer versions to speed up the sync process.
I might need to revisit that approach, or limit the number of variables and merge in-code.
The speed-up made syncs about 3x quicker than before, if not more.
But if it causes very, very large vaults to not be able to sync at all, that is an issue, of course.
I haven't had time yet to reproduce this and look into it.
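The "limit the amount and merge in-code" idea mentioned above can be sketched as follows. This is a hypothetical Python/sqlite3 illustration of the batching approach, not Vaultwarden's actual Rust/Diesel fix; the table name and batch size are assumptions:

```python
import sqlite3

# Stay under SQLite's historical SQLITE_MAX_VARIABLE_NUMBER default of 999
# (raised to 32766 in SQLite 3.32.0), so this works on old builds too.
BATCH_SIZE = 999

def fetch_by_ids(conn, ids):
    """Run one IN (...) query per batch of ids and merge the rows in code."""
    rows = []
    for start in range(0, len(ids), BATCH_SIZE):
        batch = ids[start:start + BATCH_SIZE]
        placeholders = ",".join("?" * len(batch))  # one bound variable per id
        rows.extend(conn.execute(
            f"SELECT id, name FROM ciphers WHERE id IN ({placeholders})",
            batch,
        ))
    return rows

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ciphers (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO ciphers VALUES (?, ?)",
                 [(i, f"item-{i}") for i in range(5000)])

# 3500 ids exceeds the old 999-variable limit, but each batch stays under it.
result = fetch_by_ids(conn, list(range(3500)))
print(len(result))  # 3500 rows, fetched in 4 batched queries
```

The trade-off is a few extra round-trips to the database in exchange for never exceeding the per-statement variable limit, regardless of vault size.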
@stefan0xC commented on GitHub:
If I understood the issue correctly this line creates an IN statement that is incompatible with SQLite if the number of ciphers gets too large:
367e1ce289/src/db/models/attachment.rs (L193)
According to https://www.sqlite.org/limits.html#max_variable_number the maximum was only 999 until 3.32.0, so in theory some users (that don't use the Docker image and build the binary with an older SQLite) could be more affected by this (I think 999 would still be a large number of ciphers, but a bit easier to reach than 32766 for most users).
@sorcix commented on GitHub:
That's the limit on the number of parameters (binding ? or :var in a query). That shouldn't be an issue, right?
@BlackDex commented on GitHub:
I think I have found a good solution, which may be nicer as well.
I may do some more changes to see if we can improve the performance.
Also, @stefan0xC it is actually very easy to lower the limit.
@stefan0xC commented on GitHub:
Ah, okay. I just assumed the sqlite version in use would depend on the build platform, while I could have looked in the Cargo.toml file instead. 🤦
(I was initially also wondering if it might be worth exploring that option to reproduce the issue more easily, but I think it does not matter, as even a limit of 999 is so large that it should be automated either way...)
Ah, yeah I was just wondering how to get so many entries (as I store almost all my credentials to Vaultwarden myself and I am nowhere near that).
@vitorbaptista May I ask how you got that many ciphers? Did you test something or do you maybe have an automated script? (If so you could maybe share it or a reworked version so we can more easily reproduce that issue).
I was also thinking if maybe switching to a more robust database backend like Postgres or MariaDB (which according to many answers on StackExchange apparently don't have such a "low" limit like SQLite) might be a workaround for you in the meantime (until there is a fix) but I've not tested it myself.
@BlackDex commented on GitHub:
Ok, PR done. It should solve your issue, and it looks like I shaved off a bit more of the time it takes to sync. Not much, but every bit counts. Especially with a huge cipher base 😉.
Compared to the version you are currently running, with this patch you won't even have time to get a ☕.
@vitorbaptista commented on GitHub:
@stefan0xC We use it to store third parties' credentials. It's an automated process that uses the bw CLI to add/update passwords in Vaultwarden. We also have a staging and a production organization to test this process, doubling the number of passwords we have.
Regarding migrating to another DB, that might be a better option. However, at this point, I think we'd better bite the bullet and use a Bitwarden.com organization, as we don't have that many users to begin with. I wonder if they would be able to handle this number of passwords, though.
@BlackDex commented on GitHub:
Good test environment for Vaultwarden haha.
@vitorbaptista commented on GitHub:
@BlackDex Sorry to bother, but do you have any ETA on when a new release is going to be done? Looking forward to trying your fix.
@BlackDex commented on GitHub:
Probably this weekend somewhere
@vitorbaptista commented on GitHub:
That's awesome, @BlackDex! I'll keep an eye on when this is released. Thank you for the quick turnaround.
@vitorbaptista commented on GitHub:
@BlackDex hey, I've been checking to see when this would be released. If you don't mind, I'd rather wait for the next release, given that this is a key piece of infrastructure for our company. Hopefully it won't take too long. I'll ping you here with the results.
@BlackDex commented on GitHub:
@vitorbaptista I'm curious to know if this solution works for you, and what your feeling is on the loading part.
If you could try out the testing-tagged image, that would be great!
@vitorbaptista commented on GitHub:
@BlackDex As promised, I did some quick performance comparisons. I was using version 1.23.0 before and upgraded to 1.28.0. The load time of the main page (just after logging in) is the same, 28 seconds. However, the load time of the /sync?excludeDomains=true endpoint went from 10.44s to 2.61s, a whopping 75% reduction!!!
The amount downloaded from that endpoint increased a bit, from 18.61 MB to 19.22 MB.
None of these checks were in any way scientific; I just used my regular clock to do the timings, and Firefox's Network tab to check the /sync endpoint's size and timing.
Thanks a lot for your work! The bug is solved, and I can now resume using the Bitwarden apps.