Importing silently fails without UI error for 10k character+ secure notes #963

Closed
opened 2025-10-09 16:58:33 +03:00 by OVERLORD · 11 comments
Owner

Originally created by @meramsey on GitHub.

Originally assigned to: @BlackDex on GitHub.

Subject of the issue

Import errors are not handled the same way as upstream when something fails, which is confusing for people who are not tech savvy.

Your environment (Generated via diagnostics page)

  • Vaultwarden version: v1.27.0
  • Web-vault version: v2022.12.0
  • Running within Docker: true (Base: Debian)
  • Environment settings overridden: true
  • Uses a reverse proxy: false
  • Internet access: true
  • Internet access via a proxy: false
  • DNS Check: true
  • Time Check: true
  • Domain Configuration Check: false
  • HTTPS Check: true
  • Database type: SQLite
  • Database version: 3.39.2
  • Clients used:
  • Reverse proxy and version:
  • Other relevant information:

Config (Generated via diagnostics page)

Show Running Config

Environment settings which are overridden: SIGNUPS_ALLOWED, INVITATIONS_ALLOWED, ADMIN_TOKEN

{
  "_duo_akey": null,
  "_enable_duo": false,
  "_enable_email_2fa": false,
  "_enable_smtp": true,
  "_enable_yubico": true,
  "_icon_service_csp": "",
  "_icon_service_url": "",
  "_ip_header_enabled": true,
  "_smtp_img_src": "cid:",
  "admin_ratelimit_max_burst": 3,
  "admin_ratelimit_seconds": 300,
  "admin_token": "***",
  "allowed_iframe_ancestors": "",
  "attachments_folder": "data/attachments",
  "authenticator_disable_time_drift": false,
  "data_folder": "data",
  "database_conn_init": "",
  "database_max_conns": 10,
  "database_timeout": 30,
  "database_url": "***************",
  "db_connection_retries": 15,
  "disable_2fa_remember": false,
  "disable_admin_token": false,
  "disable_icon_download": false,
  "domain": "*****://*******************",
  "domain_origin": "*****://*******************",
  "domain_path": "",
  "domain_set": true,
  "duo_host": null,
  "duo_ikey": null,
  "duo_skey": null,
  "email_attempts_limit": 3,
  "email_expiration_time": 600,
  "email_token_size": 6,
  "emergency_access_allowed": true,
  "emergency_notification_reminder_schedule": "0 3 * * * *",
  "emergency_request_timeout_schedule": "0 7 * * * *",
  "enable_db_wal": true,
  "event_cleanup_schedule": "0 10 0 * * *",
  "events_days_retain": null,
  "extended_logging": true,
  "helo_name": null,
  "hibp_api_key": null,
  "icon_blacklist_non_global_ips": true,
  "icon_blacklist_regex": null,
  "icon_cache_folder": "data/icon_cache",
  "icon_cache_negttl": 259200,
  "icon_cache_ttl": 2592000,
  "icon_download_timeout": 10,
  "icon_redirect_code": 302,
  "icon_service": "internal",
  "incomplete_2fa_schedule": "30 * * * * *",
  "incomplete_2fa_time_limit": 3,
  "invitation_expiration_hours": 120,
  "invitation_org_name": "Vaultwarden",
  "invitations_allowed": true,
  "ip_header": "X-Real-IP",
  "job_poll_interval_ms": 30000,
  "log_file": null,
  "log_level": "Info",
  "log_timestamp_format": "%Y-%m-%d %H:%M:%S.%3f",
  "login_ratelimit_max_burst": 10,
  "login_ratelimit_seconds": 60,
  "org_attachment_limit": null,
  "org_creation_users": "",
  "org_events_enabled": false,
  "org_groups_enabled": false,
  "password_hints_allowed": true,
  "password_iterations": 100000,
  "reload_templates": false,
  "require_device_email": false,
  "rsa_key_filename": "data/rsa_key",
  "send_purge_schedule": "0 5 * * * *",
  "sends_allowed": true,
  "sends_folder": "data/sends",
  "show_password_hint": false,
  "signups_allowed": true,
  "signups_domains_whitelist": "*************",
  "signups_verify": false,
  "signups_verify_resend_limit": 6,
  "signups_verify_resend_time": 3600,
  "smtp_accept_invalid_certs": false,
  "smtp_accept_invalid_hostnames": false,
  "smtp_auth_mechanism": null,
  "smtp_debug": false,
  "smtp_embed_images": true,
  "smtp_explicit_tls": null,
  "smtp_from": "********************",
  "smtp_from_name": "Vaultwarden",
  "smtp_host": "********************",
  "smtp_password": "***",
  "smtp_port": 587,
  "smtp_security": "starttls",
  "smtp_ssl": null,
  "smtp_timeout": 15,
  "smtp_username": "********************",
  "templates_folder": "data/templates",
  "tmp_folder": "data/tmp",
  "trash_auto_delete_days": null,
  "trash_purge_schedule": "0 5 0 * * *",
  "use_syslog": false,
  "user_attachment_limit": null,
  "web_vault_enabled": true,
  "web_vault_folder": "web-vault/",
  "websocket_address": "0.0.0.0",
  "websocket_enabled": true,
  "websocket_port": 3012,
  "yubico_client_id": "82595",
  "yubico_secret_key": "***",
  "yubico_server": null
}
  • Install method: docker

Steps to reproduce

  1. Export a 1Password account with a 10,000+ character secure note as a 1pux file, or use the attached file.
  2. Choose Import → 1Password (1pux).
  3. Browse to the file.
  4. Click Import; it starts loading, then fails with a console error.

Example from attached file
https://user-images.githubusercontent.com/1596188/209586637-c2158710-9e9d-487a-8808-f25370fbc611.mp4

Test 1pux file to test with (you will just need to rename it to remove the .zip extension, since GitHub doesn't allow .1pux uploads):
1PasswordExport-4WNLQ6BLP5HXFOIOEC62DSEZPQ-20221226-170653.1pux.zip

Expected behaviour

The import should show an error the way the official Bitwarden server does, listing each bad note by name so you know what to fix before exporting again.

Example:
Screenshot from 2022-12-26 17-07-29: https://user-images.githubusercontent.com/1596188/209586672-c855ed05-a4b0-405f-82ad-24f1ab4ba258.png

Import error
There was a problem with the data you tried to import. Please resolve the errors listed below in your source file and try again.
[1] [SecureNote] "Lorem ipsum 100 paragraphs large note": The field Notes exceeds the maximum encrypted value length of 10000 characters.

The error relates to the same limit that exists upstream; it is just not handled the same way in the UI here:
https://www.reddit.com/r/Bitwarden/comments/sklxy5/why_limit_secure_notes_to_10000_characters/
https://community.bitwarden.com/t/support-longer-notes-breaks-lastpass-import/2970

Actual behaviour

The import looks like it is working, but it stops and fails when it reaches the long note, and no error is shown in the main UI.

Screenshot: https://user-images.githubusercontent.com/1596188/209586354-601c0729-daf3-4e54-a798-909fea693ead.png

Raw response

XHRPOST
https://redacted/api/ciphers/import
[HTTP/2 400 Bad Request 538ms]
1
{"ErrorModel":{"Message":"The field Notes exceeds the maximum encrypted value length of 10000 characters.","Object":"error"},"ExceptionMessage":null,"ExceptionStackTrace":null,"InnerExceptionMessage":null,"Message":"The field Notes exceeds the maximum encrypted value length of 10000 characters.","Object":"error","ValidationErrors":{"":["The field Notes exceeds the maximum encrypted value length of 10000 characters."]},"error":"","error_description":""}

Troubleshooting data

OVERLORD added the bug and enhancement labels 2025-10-09 16:58:33 +03:00

@dlehammer commented on GitHub:

As a new user also affected by this symptom, I would suggest the following:

  1. The lowest-hanging fruit seems to be adding an error popover alerting the user to the error from the backend, similar to the Bitwarden example.
  2. I was surprised that a partial import had been performed. To address this, I would suggest breaking the import into separate stages:
    A. Validate the file; this would allow useful feedback to the user as to which specific entries are blocking the import, and why.
    B. When validation passes, perform the import in a single transaction.

🤓
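The two-stage approach suggested above can be sketched roughly as follows (hypothetical types; not Vaultwarden's actual import path): validate everything first and collect per-item errors, then perform the import only when the whole file is clean, so a bad file never leaves a partial import behind.

```rust
// Sketch of a validate-then-commit import. `ImportItem`, `validate_all`,
// and `import` are illustrative names, not real Vaultwarden APIs.

const MAX_NOTE_SIZE: usize = 10_000;

struct ImportItem {
    name: String,
    notes: String,
}

/// Stage A: validate all items, returning one "[i] name: reason" line per failure.
fn validate_all(items: &[ImportItem]) -> Vec<String> {
    items
        .iter()
        .enumerate()
        .filter(|(_, it)| it.notes.chars().count() > MAX_NOTE_SIZE)
        .map(|(i, it)| {
            format!("[{}] \"{}\": Notes exceeds {} characters.", i + 1, it.name, MAX_NOTE_SIZE)
        })
        .collect()
}

/// Stage B: import only when validation produced no errors.
fn import(items: &[ImportItem]) -> Result<usize, Vec<String>> {
    let errors = validate_all(items);
    if errors.is_empty() {
        // (real code would insert all items inside one DB transaction here)
        Ok(items.len())
    } else {
        Err(errors)
    }
}

fn main() {
    let items = vec![
        ImportItem { name: "ok note".to_string(), notes: "short".to_string() },
        ImportItem { name: "big note".to_string(), notes: "x".repeat(10_001) },
    ];
    match import(&items) {
        Ok(n) => println!("imported {} items", n),
        Err(errs) => {
            for e in &errs {
                println!("{}", e);
            }
        }
    }
}
```

With this shape, the error list from stage A is exactly what an upstream-style "Import error" dialog would display.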


@dlehammer commented on GitHub:

Just discovered another surprise: re-importing the same file generates duplicate folders and items, even though the import file contains data identical to the items that already exist.

Any tips on bulk-cleaning up duplicates?
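As a purely illustrative aid (not a feature of Vaultwarden or Bitwarden), spotting the duplicates in an exported vault could be as simple as keying each item by its (name, notes) pair and flagging keys seen more than once; actual cleanup would still go through the client or a purge and re-import, as done later in this thread.

```rust
// Hypothetical duplicate detector over (name, notes) pairs from an export.
use std::collections::HashMap;

/// Returns one report line per (name, notes) pair that occurs more than once.
fn find_duplicates(items: &[(String, String)]) -> Vec<String> {
    let mut seen: HashMap<&(String, String), usize> = HashMap::new();
    for item in items {
        *seen.entry(item).or_insert(0) += 1;
    }
    seen.into_iter()
        .filter(|(_, n)| *n > 1)
        .map(|(k, n)| format!("\"{}\" appears {} times", k.0, n))
        .collect()
}

fn main() {
    let items = vec![
        ("Note A".to_string(), "text".to_string()),
        ("Note A".to_string(), "text".to_string()),
        ("Note B".to_string(), "other".to_string()),
    ];
    for line in find_duplicates(&items) {
        println!("{}", line);
    }
}
```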


@cksapp commented on GitHub:

Another similar previous discussion related to importing notes that exceed the maximum allowed value:

https://github.com/dani-garcia/vaultwarden/discussions/2176
https://github.com/dani-garcia/vaultwarden/issues/2937

I'm sure imposing the limit shouldn't be too difficult, but I wonder whether there is any elegant way to "convert" existing notes that are currently over the upstream value, should Vaultwarden also honor the 10k encrypted limit, to minimize breaking things as much as possible.


@htunlogic commented on GitHub:

Why is there a 10,000-character limit anyway, and can we extend it?
If I googled correctly, the limit for a TEXT field is 65,535 characters.

That means we could get 65,535 characters without changing the database schema.

LastPass allows a lot more characters, so it's impossible to import notes properly from LastPass.


@meramsey commented on GitHub:

Just discovered another surprise: re-importing the same file generates duplicate items, even though the import file contains data identical to the items that already exist.

Any tips on bulk-cleaning up duplicates?

I just purged the vault data and reattempted. That is an upstream bug too, but I think validating all items before actually restoring would prevent it in the first place. I'm not sure how feasible that is, or whether you could use a temporary vault, nuke it if the import fails, and, if it succeeds, copy over (or do the real import into) the main one.


@BlackDex commented on GitHub:

Might I suggest using attachments in this case? You can add attachments to these items.
The only thing is that they aren't synced, because you need to download them.


@htunlogic commented on GitHub:

@BlackDex You are correct. I hadn't really been thinking about that... My only priority is to get the hell away from LastPass :)


@htunlogic commented on GitHub:

Well, the only things I'm really having an issue with are the GPG keys I have stored in secure notes. I'll shuffle through them; some are expired and old, so those aren't that important, and for the others I'll give attachments a try.

Thanks a lot! :)


@BlackDex commented on GitHub:

@htunlogic see: #2937
We had no limit before, but if that causes the clients to crash or similar, then an extended limit isn't worth having.
Also, if we made it larger, people couldn't switch to Bitwarden if they wanted to, the same thing that is now happening to you coming from a different password manager.

We want to stay as close as possible to Bitwarden, and we don't want clients to break, because that would make the whole thing useless.


@meramsey commented on GitHub:

I wonder whether creating an attachment from the 10k+ character note, attached to a note with the same name as the original, would be an acceptable compromise?

It doesn't technically deviate from upstream; it just makes it easier for people to get into the Bitwarden format without having to delete long notes, re-export, and try importing again.


@BlackDex commented on GitHub:

I wonder whether creating an attachment from the 10k+ character note, attached to a note with the same name as the original, would be an acceptable compromise?

It doesn't technically deviate from upstream; it just makes it easier for people to get into the Bitwarden format without having to delete long notes, re-export, and try importing again.

That isn't something we can do in the background. Attachments are handled differently; filenames are encrypted, for example, which isn't something we can (or want to) do on the server side.

So you can do this yourself, of course.

Reference: starred/vaultwarden#963