Mirror of https://github.com/dani-garcia/vaultwarden.git, synced 2025-12-11 09:13:02 +03:00
Importing silently fails without UI error for 10k character+ secure notes #963
Originally created by @meramsey on GitHub.
Originally assigned to: @BlackDex on GitHub.
Subject of the issue
Import errors are not handled the same way as upstream when items fail to import, which is confusing for users who are not tech-savvy.
Your environment (Generated via diagnostics page)
Config (Generated via diagnostics page)
Show Running Config
Environment settings which are overridden: SIGNUPS_ALLOWED, INVITATIONS_ALLOWED, ADMIN_TOKEN
Steps to reproduce
Export a 1Password account containing a 10,000+ character secure note as a .1pux file, or use the attached file.
Import 1Password 1pux
Browse to the file
Click Import; it starts loading, then shows a console error
Example from attached file
https://user-images.githubusercontent.com/1596188/209586637-c2158710-9e9d-487a-8808-f25370fbc611.mp4
Test 1pux file to test with. You will just need to rename it to remove the .zip extension, since GitHub doesn't allow .1pux uploads.
1PasswordExport-4WNLQ6BLP5HXFOIOEC62DSEZPQ-20221226-170653.1pux.zip
Expected behaviour
The import should show an error like it does on upstream Bitwarden, which lists each failing note by name so you know what to fix before exporting again.
Example:

Import error
There was a problem with the data you tried to import. Please resolve the errors listed below in your source file and try again.
[1] [SecureNote] "Lorem ipsum 100 paragraphs large note": The field Notes exceeds the maximum encrypted value length of 10000 characters.
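The per-item validation behaviour described above can be sketched roughly as follows. This is a minimal illustration, not Vaultwarden's actual (Rust) code; the 10,000-character limit and the error wording are taken from the upstream client's message shown above, and the item field names are assumptions.

```python
# Hypothetical pre-import validation: collect ALL errors before writing
# anything, instead of failing silently partway through the import.

MAX_NOTE_LEN = 10_000  # upstream limit on the encrypted Notes value


def validate_import(items):
    """Return a list of human-readable errors; an empty list means OK."""
    errors = []
    for i, item in enumerate(items, start=1):
        notes = item.get("notes") or ""
        if len(notes) > MAX_NOTE_LEN:
            errors.append(
                f'[{i}] [{item.get("type", "Item")}] "{item.get("name", "?")}": '
                f"The field Notes exceeds the maximum encrypted value length "
                f"of {MAX_NOTE_LEN} characters."
            )
    return errors


items = [
    {"type": "SecureNote", "name": "short note", "notes": "ok"},
    {"type": "SecureNote", "name": "Lorem ipsum 100 paragraphs large note",
     "notes": "x" * 20_000},
]
for err in validate_import(items):
    print(err)
```

Because validation runs over the whole file first, the user gets every offending entry in one pass instead of a silent partial import.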
The error is related to the upstream limit; it's just not surfaced the same way in the UI here:
https://www.reddit.com/r/Bitwarden/comments/sklxy5/why_limit_secure_notes_to_10000_characters/
https://community.bitwarden.com/t/support-longer-notes-breaks-lastpass-import/2970
Actual behaviour
The import looks like it's working, but it stops and fails when it reaches the long note, and no error is shown in the main UI.
Response raw
Troubleshooting data
@dlehammer commented on GitHub:
As a new user also affected by this symptom, I would suggest the following:
A. Validation of the file; this would allow useful feedback to the user as to which specific entries are blocking the import, and why.
B. When validation passes, the import is performed in a single transaction.
🤓
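The two suggestions above can be combined as "validate first, then write everything in one transaction", so a failing item can never leave a half-imported vault behind. A sketch, using SQLite purely for illustration (Vaultwarden's real schema and storage layer differ; the table and limit here are assumptions):

```python
import sqlite3

MAX_NOTE_LEN = 10_000  # assumed upstream limit on the Notes value


def import_items(conn, items):
    # Phase 1: validate everything up front and report all problems at once.
    errors = [f'"{it["name"]}": Notes exceeds {MAX_NOTE_LEN} characters'
              for it in items if len(it.get("notes", "")) > MAX_NOTE_LEN]
    if errors:
        return errors
    # Phase 2: write all rows in a single transaction; any failure rolls
    # back, so a partial import is never left behind.
    with conn:  # sqlite3 connection context manager: commit or rollback
        conn.executemany(
            "INSERT INTO ciphers (name, notes) VALUES (?, ?)",
            [(it["name"], it.get("notes", "")) for it in items],
        )
    return []


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ciphers (name TEXT, notes TEXT)")
# A failing validation leaves the table completely untouched:
print(import_items(conn, [{"name": "bad", "notes": "x" * 20_000}]))
print(conn.execute("SELECT COUNT(*) FROM ciphers").fetchone()[0])
```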
@dlehammer commented on GitHub:
Just discovered another surprise: re-importing the same file generates duplicate folders and items, even though the import file contains data identical to the already-existing items.
Any tips on bulk-cleaning up duplicates?
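One workable approach to bulk-cleaning such duplicates is to export the vault to unencrypted JSON, drop items whose identifying fields are identical, and re-import into an emptied vault. An illustrative sketch; the `items`/`folderId`/`name`/`notes` field names follow the Bitwarden JSON export format, and which fields count as "identical" is a judgment call you should adjust for your own data:

```python
import json


def dedupe(export):
    """Keep only the first occurrence of each (folderId, name, notes) triple."""
    seen = set()
    kept = []
    for item in export["items"]:
        key = (item.get("folderId"), item.get("name"), item.get("notes"))
        if key not in seen:
            seen.add(key)
            kept.append(item)
    return {**export, "items": kept}


export = {"items": [
    {"folderId": "f1", "name": "a", "notes": "n"},
    {"folderId": "f1", "name": "a", "notes": "n"},   # exact duplicate
    {"folderId": "f2", "name": "a", "notes": "n"},   # same name, other folder
]}
print(json.dumps(dedupe(export), indent=2))
```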
@cksapp commented on GitHub:
Another similar previous discussion related to importing notes that exceed the maximum allowed value:
https://github.com/dani-garcia/vaultwarden/discussions/2176
https://github.com/dani-garcia/vaultwarden/issues/2937
I'm sure imposing the limit shouldn't be too difficult, but I wonder whether there is an elegant way to "convert" any existing notes that currently exceed the upstream value, should Vaultwarden also honor the 10k encrypted limit, so as to minimize breakage as much as possible.
@htunlogic commented on GitHub:
Why is there a 10,000-character limit anyway, and can we extend it?
If I googled correctly, the limit for a TEXT field is 65,535 characters.
That means we could get 65,535 characters without changing the database schema.
LastPass allows far more characters, so it's impossible to import notes properly from LastPass.
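Note that the 10,000-character limit applies to the *encrypted* value, so the usable plaintext is somewhat smaller than 10k. A rough estimate, assuming the Bitwarden type-2 cipher string layout (`2.<iv>|<ciphertext>|<mac>`: AES-256-CBC with PKCS#7 padding, a 16-byte IV, a 32-byte HMAC-SHA256 tag, each part base64-encoded). This is back-of-the-envelope arithmetic, not a statement about Vaultwarden's code:

```python
import math


def b64_len(n_bytes):
    """Length of standard (padded) base64 for n_bytes of data."""
    return 4 * math.ceil(n_bytes / 3)


def enc_string_len(plaintext_bytes):
    """Estimated length of a '2.<iv>|<ct>|<mac>' cipher string."""
    padded_ct = (plaintext_bytes // 16 + 1) * 16  # PKCS#7 always pads
    return (len("2.")                # encryption-type prefix
            + b64_len(16)            # 16-byte IV
            + 1 + b64_len(padded_ct) # '|' + ciphertext
            + 1 + b64_len(32))       # '|' + 32-byte HMAC-SHA256 tag


# Largest plaintext whose cipher string stays within 10,000 characters:
n = 0
while enc_string_len(n + 1) <= 10_000:
    n += 1
print(n)
```

Under these assumptions the practical plaintext ceiling lands noticeably below 10,000 bytes, which is why a note that looks "under the limit" in the source manager can still be rejected after encryption.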
@meramsey commented on GitHub:
I just purged the vault data and reattempted. That is an upstream bug too, but I think validating all items before actually importing would prevent it in the first place. I'm not sure how feasible that is, or whether you could use a temporary vault and nuke it if the import fails, then copy (or do the real import) into the main one if it succeeds.
@BlackDex commented on GitHub:
Might I suggest using attachments in this case? You can add attachments to these items.
The only caveat is that they aren't synced, because you need to download them.
@htunlogic commented on GitHub:
@BlackDex You are correct. I hadn't really been thinking about that... My only priority is to get the hell away from LastPass :)
@htunlogic commented on GitHub:
Well, the only thing I'm having issue with really are the GPG keys I have stored in secure notes. I'll shuffle through them, some of them are expired and old so those aren't that important, and others, I'll give it a try with attachments.
Thanks a lot! :)
@BlackDex commented on GitHub:
@htunlogic see: #2937
We had no limit before. But if that causes the clients to crash or misbehave, then it's not worth having an extended limit.
Also, if we made it larger, people couldn't switch to Bitwarden if they wanted to, which is the same situation you're in now, coming from a different password manager.
We want to stay as close as possible to Bitwarden and don't want clients to break, because that would make Vaultwarden totally useless.
@meramsey commented on GitHub:
I wonder whether creating an attachment from the 10k+ character note, attached to a note with the same name as the original, would be an acceptable compromise?
It doesn't technically deviate from upstream; it just makes it easier for people to get into the Bitwarden format without having to delete long notes, re-export, and try importing again.
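That workaround can also be done client-side before importing. A sketch that assumes the unencrypted Bitwarden JSON export format (`items` with `name` and `notes` fields) and filesystem-safe item names: it writes each oversized note out to a file you can attach manually afterwards, and replaces the note body with a short pointer.

```python
import json
import pathlib

MAX_NOTE_LEN = 10_000  # assumed upstream limit on the Notes field


def split_long_notes(export, out_dir):
    """Write oversized notes to <out_dir>/<name>.txt and truncate them."""
    out = pathlib.Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for item in export["items"]:
        notes = item.get("notes") or ""
        if len(notes) > MAX_NOTE_LEN:
            # Assumes item names are filesystem-safe; sanitize if not.
            path = out / f"{item['name']}.txt"
            path.write_text(notes, encoding="utf-8")
            item["notes"] = f"[Full note moved to attachment: {path.name}]"
    return export


export = {"items": [{"name": "big-note", "notes": "x" * 20_000}]}
fixed = split_long_notes(export, "note-attachments")
print(json.dumps(fixed["items"][0]["notes"]))
```

After running it, import the trimmed JSON and attach the generated `.txt` files to their items by hand; the server never has to touch attachment encryption, which matches the objection in the following comment.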
@BlackDex commented on GitHub:
That isn't something we can do in the background. Attachments are handled differently; filenames are encrypted, for example, which is not something we can (or want to) do on the server side.
So, you can of course do this yourself.