[BUG] Microservices container keeps restarting #481

Closed
opened 2026-02-04 20:45:01 +03:00 by OVERLORD · 23 comments

Originally created by @PeterBasista on GitHub (Dec 14, 2022).

Describe the bug
The microservices container keeps restarting.

Task List

  • [x] I have read thoroughly the README setup and installation instructions.
  • [ ] I have included my docker-compose file.
  • [ ] I have included my redacted .env file.
  • [x] I have included information on my machine and environment.

(I'm using the default docker compose file from the Immich installation guide.)

To Reproduce
Steps to reproduce the behavior:

  1. Start the stack (Portainer)
  2. The microservices container keeps restarting

Expected behavior
The microservices container keeps running.

Screenshots
Screenshot (2022-12-14 at 9:32:30): https://user-images.githubusercontent.com/58752440/207545659-8834ba94-70ea-4ed0-974e-b4feb104768c.png

Container log:

[Nest] 7  - 12/14/2022, 8:32:20 AM     LOG [NestFactory] Starting Nest application...
[Nest] 7  - 12/14/2022, 8:32:20 AM     LOG [InstanceLoader] DatabaseModule dependencies initialized +226ms
[Nest] 7  - 12/14/2022, 8:32:20 AM     LOG [InstanceLoader] TypeOrmModule dependencies initialized +1ms
[Nest] 7  - 12/14/2022, 8:32:20 AM     LOG [InstanceLoader] BullModule dependencies initialized +2ms
[Nest] 7  - 12/14/2022, 8:32:20 AM     LOG [InstanceLoader] JwtModule dependencies initialized +3ms
[Nest] 7  - 12/14/2022, 8:32:20 AM     LOG [InstanceLoader] ConfigHostModule dependencies initialized +2ms
[Nest] 7  - 12/14/2022, 8:32:20 AM     LOG [InstanceLoader] DiscoveryModule dependencies initialized +2ms
[Nest] 7  - 12/14/2022, 8:32:20 AM     LOG [InstanceLoader] ConfigModule dependencies initialized +41ms
[Nest] 7  - 12/14/2022, 8:32:20 AM     LOG [InstanceLoader] BullModule dependencies initialized +2ms
[Nest] 7  - 12/14/2022, 8:32:20 AM     LOG [InstanceLoader] BullModule dependencies initialized +3ms
[Nest] 7  - 12/14/2022, 8:32:21 AM     LOG [InstanceLoader] TypeOrmCoreModule dependencies initialized +624ms
[Nest] 7  - 12/14/2022, 8:32:21 AM     LOG [InstanceLoader] TypeOrmModule dependencies initialized +2ms
[Nest] 7  - 12/14/2022, 8:32:21 AM     LOG [InstanceLoader] TypeOrmModule dependencies initialized +2ms
[Nest] 7  - 12/14/2022, 8:32:21 AM     LOG [InstanceLoader] TypeOrmModule dependencies initialized +1ms
[Nest] 7  - 12/14/2022, 8:32:21 AM     LOG Initialising Reverse Geocoding
[Nest] 7  - 12/14/2022, 8:32:21 AM     LOG [InstanceLoader] ImmichJwtModule dependencies initialized +36ms
[Nest] 7  - 12/14/2022, 8:32:21 AM     LOG [InstanceLoader] ImmichConfigModule dependencies initialized +2ms
[Nest] 7  - 12/14/2022, 8:32:21 AM     LOG [InstanceLoader] CommunicationModule dependencies initialized +1ms
[Nest] 7  - 12/14/2022, 8:32:21 AM     LOG [InstanceLoader] MicroservicesModule dependencies initialized +2ms
/usr/src/app/node_modules/local-reverse-geocoder/index.js:746
            throw err;
            ^
CsvError: Invalid Record Length: expect 19, got 11 on line 44811
    at Object.__onRecord (/usr/src/app/node_modules/csv-parse/dist/cjs/index.cjs:933:11)
    at Object.parse (/usr/src/app/node_modules/csv-parse/dist/cjs/index.cjs:896:36)
    at Parser._flush (/usr/src/app/node_modules/csv-parse/dist/cjs/index.cjs:1325:26)
    at Parser.final [as _final] (node:internal/streams/transform:112:25)
    at callFinal (node:internal/streams/writable:694:27)
    at prefinish (node:internal/streams/writable:723:7)
    at finishMaybe (node:internal/streams/writable:733:5)
    at afterWrite (node:internal/streams/writable:504:3)
    at onwrite (node:internal/streams/writable:477:7)
    at Parser.Transform._read (node:internal/streams/transform:245:5) {
  code: 'CSV_RECORD_INCONSISTENT_FIELDS_LENGTH',
  bytes: 7569408,
  comment_lines: 0,
  empty_lines: 0,
  invalid_field_length: 0,
  lines: 44811,
  records: 44810,
  columns: false,
  error: undefined,
  header: false,
  [] index: 11,
  raw: undefined,
  column: 11,
  quoting: false,
  record: [
    '2874142', 'Malberg',
    'Malberg', '',
    '50.05',   '6.58333',
    'P',       'PPLA4',
    'DE',      '',
    '08'
  ]
}
[Nest] 7  - 12/14/2022, 8:32:44 AM     LOG [NestFactory] Starting Nest application...
[Nest] 7  - 12/14/2022, 8:32:44 AM     LOG [InstanceLoader] DatabaseModule dependencies initialized +232ms
[Nest] 7  - 12/14/2022, 8:32:44 AM     LOG [InstanceLoader] TypeOrmModule dependencies initialized +2ms
[Nest] 7  - 12/14/2022, 8:32:44 AM     LOG [InstanceLoader] BullModule dependencies initialized +1ms
[Nest] 7  - 12/14/2022, 8:32:44 AM     LOG [InstanceLoader] JwtModule dependencies initialized +3ms
[Nest] 7  - 12/14/2022, 8:32:44 AM     LOG [InstanceLoader] ConfigHostModule dependencies initialized +3ms
[Nest] 7  - 12/14/2022, 8:32:44 AM     LOG [InstanceLoader] DiscoveryModule dependencies initialized +2ms
[Nest] 7  - 12/14/2022, 8:32:44 AM     LOG [InstanceLoader] ConfigModule dependencies initialized +34ms
[Nest] 7  - 12/14/2022, 8:32:44 AM     LOG [InstanceLoader] BullModule dependencies initialized +2ms
[Nest] 7  - 12/14/2022, 8:32:44 AM     LOG [InstanceLoader] BullModule dependencies initialized +3ms
[Nest] 7  - 12/14/2022, 8:32:44 AM     LOG [InstanceLoader] TypeOrmCoreModule dependencies initialized +584ms
[Nest] 7  - 12/14/2022, 8:32:44 AM     LOG [InstanceLoader] TypeOrmModule dependencies initialized +2ms
[Nest] 7  - 12/14/2022, 8:32:44 AM     LOG [InstanceLoader] TypeOrmModule dependencies initialized +2ms
[Nest] 7  - 12/14/2022, 8:32:44 AM     LOG [InstanceLoader] TypeOrmModule dependencies initialized +2ms
[Nest] 7  - 12/14/2022, 8:32:44 AM     LOG Initialising Reverse Geocoding
[Nest] 7  - 12/14/2022, 8:32:44 AM     LOG [InstanceLoader] ImmichJwtModule dependencies initialized +57ms
[Nest] 7  - 12/14/2022, 8:32:44 AM     LOG [InstanceLoader] ImmichConfigModule dependencies initialized +2ms
[Nest] 7  - 12/14/2022, 8:32:44 AM     LOG [InstanceLoader] CommunicationModule dependencies initialized +1ms
[Nest] 7  - 12/14/2022, 8:32:44 AM     LOG [InstanceLoader] MicroservicesModule dependencies initialized +3ms
/usr/src/app/node_modules/local-reverse-geocoder/index.js:746
            throw err;
            ^
CsvError: Invalid Record Length: expect 19, got 11 on line 44811
    at Object.__onRecord (/usr/src/app/node_modules/csv-parse/dist/cjs/index.cjs:933:11)
    at Object.parse (/usr/src/app/node_modules/csv-parse/dist/cjs/index.cjs:896:36)
    at Parser._flush (/usr/src/app/node_modules/csv-parse/dist/cjs/index.cjs:1325:26)
    at Parser.final [as _final] (node:internal/streams/transform:112:25)
    at callFinal (node:internal/streams/writable:694:27)
    at prefinish (node:internal/streams/writable:723:7)
    at finishMaybe (node:internal/streams/writable:733:5)
    at afterWrite (node:internal/streams/writable:504:3)
    at onwrite (node:internal/streams/writable:477:7)
    at Parser.Transform._read (node:internal/streams/transform:245:5) {
  code: 'CSV_RECORD_INCONSISTENT_FIELDS_LENGTH',
  bytes: 7569408,
  comment_lines: 0,
  empty_lines: 0,
  invalid_field_length: 0,
  lines: 44811,
  records: 44810,
  columns: false,
  error: undefined,
  header: false,
  [] index: 11,
  raw: undefined,
  column: 11,
  quoting: false,
  record: [
    '2874142', 'Malberg',
    'Malberg', '',
    '50.05',   '6.58333',
    'P',       'PPLA4',
    'DE',      '',
    '08'
  ]
}

System

  • Phone OS [iOS, Android]: iOS
  • Server Version: 1.38.2
  • Mobile App Version: 1.38.0 Build.75

@bo0tzz commented on GitHub (Dec 14, 2022):

The geocoder has picked up some inconsistent state. That should be fixed by deleting the microservices container and then recreating it.
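
For anyone doing this from the CLI rather than Portainer, a minimal sketch (the service name immich-microservices is taken from the compose file posted further down in this thread; older installs use docker-compose with a hyphen):

```
docker compose rm -sf immich-microservices   # stop and remove the container
docker compose up -d immich-microservices    # recreate it from the same image
```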


@PeterBasista commented on GitHub (Dec 14, 2022):

Yes, this helps for a while and it goes away, but then it happens again. I currently have 48,000+ files in the jobs queue and it crashes after about 50. I can't keep recreating the container; there must be another solution for this.

I also just spotted one more error, which then triggers the restart loop.

[Nest] 7  - 12/14/2022, 8:50:15 AM   ERROR Failed to generate jpeg thumbnail for asset: 94a73598-6c87-4c89-860f-6b4caeef6930
/usr/src/app/node_modules/local-reverse-geocoder/index.js:746
            throw err;
            ^
Error downloading GeoNames admin 2 codes data: Error: read ECONNRESET
(Use `node --trace-uncaught ...` to show where the exception was thrown)

@bo0tzz commented on GitHub (Dec 14, 2022):

Can you post the full logs of the microservices container after recreating it up to the latest error you posted?


@PeterBasista commented on GitHub (Dec 14, 2022):

The problem is that I only managed to capture that bit more by accident. After the container is recreated, the logs of the old container are cleared, and a loop starts: the microservices start up and then produce the (looping) log above. Which means I can't get to the old container's log. The only thing I know is that before this it generated a thumbnail for a video (.mov) and ended up with an ffmpeg error saying the file is invalid, but I don't remember the exact error.


@PeterBasista commented on GitHub (Dec 15, 2022):

Okay, it happened again. I'm attaching log files (the entire log, 18 MB, and a slice with some context before the first error and then the loop, 145 KB). The logs are from Docker.

PS: The object detection error is fine for me; that container is turned off because I don't need object detection.

Line 85 - the first error that triggered the container restart loop - min.log (https://github.com/immich-app/immich/files/10235505/4c7d9ec16a4a9db32802bc2342997a5a82224916e5f31b4f9a2d6a03e84210d8-json.min.log)
Line 85 -> line 59380 - fulllog.log (https://github.com/immich-app/immich/files/10235506/4c7d9ec16a4a9db32802bc2342997a5a82224916e5f31b4f9a2d6a03e84210d8-json.log)

PS: It just crashed again due to another error (log downloaded from Portainer):
Line 198 - immich-immich-microservices-1_logs-3.txt (https://github.com/immich-app/immich/files/10235844/_immich-immich-microservices-1_logs-3.txt)


@bo0tzz commented on GitHub (Dec 15, 2022):

When the microservices container starts, it tries to download the geocoder data in the background. That process is failing - the network connection is timing out or aborting. From inside the microservices container, can you test whether it can reach the geodata URL (wget https://download.geonames.org/export/dump/)?
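
For example, one way to run that check from the host (the container name is an assumption; adjust it to whatever docker ps shows for the microservices service):

```
docker exec -it immich-immich-microservices-1 wget https://download.geonames.org/export/dump/
```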


@PeterBasista commented on GitHub (Dec 15, 2022):

Here is the result
Screenshot (2022-12-15 at 12:31:46): https://user-images.githubusercontent.com/58752440/207848934-e08ccd97-4cff-4c37-aa94-2d3c8d0af1e9.png


@alextran1502 commented on GitHub (Dec 15, 2022):

Can you include your docker-compose content?


@PeterBasista commented on GitHub (Dec 15, 2022):

version: "3.8"

services:
  immich-server:
    image: altran1502/immich-server:release
    entrypoint: ["/bin/sh", "./start-server.sh"]
    volumes:
      - ${UPLOAD_LOCATION}:/usr/src/app/upload
    env_file:
      - stack.env
    environment:
      - NODE_ENV=production
    depends_on:
      - redis
      - database
    restart: always

  immich-microservices:
    image: altran1502/immich-server:release
    entrypoint: ["/bin/sh", "./start-microservices.sh"]
    volumes:
      - ${UPLOAD_LOCATION}:/usr/src/app/upload
    env_file:
      - stack.env
    environment:
      - NODE_ENV=production
    depends_on:
      - redis
      - database
    restart: always

  immich-machine-learning:
    image: altran1502/immich-machine-learning:release
    entrypoint: ["/bin/sh", "./entrypoint.sh"]
    volumes:
      - ${UPLOAD_LOCATION}:/usr/src/app/upload
    env_file:
      - stack.env
    environment:
      - NODE_ENV=production
    depends_on:
      - database
    profiles:
      - donotstart
    restart: always

  immich-web:
    image: altran1502/immich-web:release
    entrypoint: ["/bin/sh", "./entrypoint.sh"]
    env_file:
      - stack.env
    environment:
      # Rename these values for svelte public interface
      - PUBLIC_IMMICH_SERVER_URL=${IMMICH_SERVER_URL}
    restart: always

  redis:
    container_name: immich_redis
    image: redis:6.2
    restart: always

  database:
    container_name: immich_postgres
    image: postgres:14
    env_file:
      - stack.env
    environment:
      POSTGRES_PASSWORD: ${DB_PASSWORD}
      POSTGRES_USER: ${DB_USERNAME}
      POSTGRES_DB: ${DB_DATABASE_NAME}
      PG_DATA: /var/lib/postgresql/data
    volumes:
      - pgdata:/var/lib/postgresql/data
    ports:
      - "5432:5432"
    restart: always

  immich-proxy:
    container_name: immich_proxy
    image: altran1502/immich-proxy:release
    environment:
      # Make sure these values get passed through from the env file
      - IMMICH_SERVER_URL
      - IMMICH_WEB_URL
    ports:
      - 2283:8080
    logging:
      driver: none
    depends_on:
      - immich-server
    restart: always

volumes:
  pgdata:

@alextran1502 commented on GitHub (Dec 15, 2022):

Can you also include redacted .env file?


@PeterBasista commented on GitHub (Dec 15, 2022):

PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
HOSTNAME=f707219a0c04
REDIS_HOSTNAME=immich_redis
DB_PASSWORD=postgres
JWT_SECRET=3f10mUCN84bqIufAmZEMtxIFbe06XF6Gn7e6C1i8fKS8n7jT+z7tHCF1ALFeH5nAiZJ+5DFyvYjIlpKqSTImgQogGpo29M9sRJ3dIjmnxhn76wHXg7xzOhJRaUb+3gd4cj4ZRYjzj1jhYzRJmcz1US8lhbkpbNmbLn9/SIxuXRg=
DB_USERNAME=postgres
LOG_LEVEL=simple
DB_HOSTNAME=immich_postgres
NODE_ENV=production
UPLOAD_LOCATION=/srv/dev-disk-by-uuid-8d12abc7-14c7-49d5-9d0e-ed350f5f74eb/immich-photos
PUBLIC_LOGIN_PAGE_MESSAGE=
DB_DATABASE_NAME=immich
NODE_VERSION=16.15.1
YARN_VERSION=1.22.19
HOME=/root

@alextran1502 commented on GitHub (Dec 15, 2022):

What is the purpose of the PATH, HOSTNAME, NODE_VERSION, YARN_VERSION and HOME variables in this stack?


@PeterBasista commented on GitHub (Dec 15, 2022):

That was a dump from docker exec f707219a0c04 /usr/bin/env; my env as defined in the Portainer stack is:

DB_HOSTNAME=immich_postgres
DB_USERNAME=postgres
DB_PASSWORD=postgres
DB_DATABASE_NAME=immich
REDIS_HOSTNAME=immich_redis
UPLOAD_LOCATION=/srv/dev-disk-by-uuid-8d12abc7-14c7-49d5-9d0e-ed350f5f74eb/immich-photos
LOG_LEVEL=simple
JWT_SECRET=3f10mUCN84bqIufAmZEMtxIFbe06XF6Gn7e6C1i8fKS8n7jT+z7tHCF1ALFeH5nAiZJ+5DFyvYjIlpKqSTImgQogGpo29M9sRJ3dIjmnxhn76wHXg7xzOhJRaUb+3gd4cj4ZRYjzj1jhYzRJmcz1US8lhbkpbNmbLn9/SIxuXRg=
PUBLIC_LOGIN_PAGE_MESSAGE=

@alextran1502 commented on GitHub (Dec 15, 2022):

Thank you. From the log files, it looks like you have a network issue in the stack, so the containers cannot communicate with each other or download the required files for geocoding.

Can you remove the stack and create a new instance? I suspect something might have happened during the creation process for this stack.


@jrasm91 commented on GitHub (Dec 15, 2022):

Would you mind adding the output of the tree command for the .reverse-geocoding-dump folder in the container?
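
For example (container name and path are assumptions; ls -lR is a fallback if tree isn't installed in the image):

```
docker exec immich-immich-microservices-1 tree /usr/src/app/.reverse-geocoding-dump
```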


@jrasm91 commented on GitHub (Dec 16, 2022):

Based on the logs, it looks like the download of these files is being terminated prematurely, leaving a partially downloaded file, which would explain the CSV errors. As far as I can tell, these files get downloaded each day from https://download.geonames.org/export/dump/ and renamed to match the current date, but there are no integrity checks, so eventually you end up with a corrupted file and all the lookups start failing with a CSV parsing error (hence the constant restarts).

One option you have is to manually download/rename the files and put them in the .reverse-geocoding-dump folder yourself (probably via a mount), then restart the container and see if that helps (a rough sketch follows the tree below). The cities500 file seems to be the biggest one. The folder structure should look like this:

├── admin1_codes
│   └── admin1CodesASCII_2022-12-15.txt
├── admin2_codes
│   └── admin2Codes_2022-12-15.txt
└── cities500
    └── cities500_2022-12-15.txt
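
A rough sketch of such a manual download, assuming the folder above is mounted into the container and that the library expects files named after the current UTC date (both assumptions based on the tree above):

```
# Run on the host against the folder that gets mounted into the container.
set -e
DUMP=./reverse-geocoding-dump
DATE=$(date -u +%F)   # e.g. 2022-12-15; the library appears to use the current UTC date
mkdir -p "$DUMP/cities500" "$DUMP/admin1_codes" "$DUMP/admin2_codes"

# cities500 ships as a zip; extract the .txt and give it the dated name
wget -O /tmp/cities500.zip https://download.geonames.org/export/dump/cities500.zip
unzip -p /tmp/cities500.zip cities500.txt > "$DUMP/cities500/cities500_${DATE}.txt"

# the admin code files are plain text
wget -O "$DUMP/admin1_codes/admin1CodesASCII_${DATE}.txt" \
  https://download.geonames.org/export/dump/admin1CodesASCII.txt
wget -O "$DUMP/admin2_codes/admin2Codes_${DATE}.txt" \
  https://download.geonames.org/export/dump/admin2Codes.txt
```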

Not sure what, outside of a network error, would cause the files to download incorrectly. I know that the library downloads the files on start-up, but it also has to download new ones when a new (UTC) day begins, so there might be a bug related to that part of the process. Do these errors by chance happen around the same time? Potentially a short time after midnight UTC?


@PeterBasista commented on GitHub (Dec 16, 2022):

Thank you guys. All the assets are processed now, so it's quite hard to tell whether the problem persists. I've re-deployed the whole stack and so far everything is fine; I've also looked in the container and the files there are from today.


@alextran1502 commented on GitHub (Dec 22, 2022):

Please reopen the issue if the problem arises again.


@twitsforbrains commented on GitHub (Mar 7, 2023):

Not sure if this helps, but I mount the reverse geocode folder on my server. I started the server 7 hours ago, March 6th 2023 at ~20:00 UTC. I don't know when it started failing, but I believe it was during the turnover of the UTC day, as right now it is 3 AM March 7th UTC. My server is on UTC, but my local computer is not (not sure if that matters). In the folders I saw some March 6th and some March 7th files. When I went to https://download.geonames.org/export/dump/ as recommended above, I noticed that only files for March 6th are actually available to download. The March 7th files are definitely not full files (the admin1 one is only a few bytes); the March 8th city file is 31 MB, but the March 7th one is only 2 MB.

Looking at the immich-microservices logs I see a connection terminated error, but it's not clear what caused it. My current guess is that something kills the microservice, which then doesn't finish downloading the file. On the next restart, my mounted folder means the microservice doesn't re-download a full file, and I'm stuck with partial files until I delete them.

Probably on start-up we want to always download these files if they are that important? Alternatively, download to a different folder and add a symlink to it, so that either the download succeeds and the symlink exists, or it fails and the file doesn't exist? I haven't thought it through much...
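
A minimal illustration of that "download elsewhere, then swap" idea (not what the library currently does; file names are only examples):

```
# Download to a temporary name and only move it into place if the download succeeded,
# so an interrupted download never leaves a partial file where the reader looks.
if wget -O cities500.zip.part https://download.geonames.org/export/dump/cities500.zip; then
    mv cities500.zip.part cities500.zip   # a rename within one directory is atomic
else
    rm -f cities500.zip.part              # clean up the partial download
fi
```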


@jrasm91 commented on GitHub (Mar 7, 2023):

Yeah, the issue is that all of that logic and implementation is handled by another library, which does not handle this case very well.

https://github.com/tomayac/local-reverse-geocoder/issues/63


@bo0tzz commented on GitHub (Mar 7, 2023):

@jrasm91 as a workaround, would it work if we catch the failure and just delete the cache folder in the error handler?
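
Until something like that lands, the manual equivalent is roughly the following (the cache path inside the container is an assumption; use whatever container name docker ps shows):

```
docker exec <microservices-container> rm -rf /usr/src/app/.reverse-geocoding-dump
docker restart <microservices-container>
```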


@jrasm91 commented on GitHub (Mar 7, 2023):

Good idea - it is probably easy enough to detect this specific error and then try it again automatically after deleting the corrupted download.


@jrasm91 commented on GitHub (Mar 7, 2023):

Created #1963 to track the root issue.

Reference: immich-app/immich#481