MongoDB not stable - MongoNetworkError - Snap version 4.8.3 - Ubuntu 20.04

Description

Can someone provide a little insight into this issue? I've been struggling with it for a very, very long time now… Please!

Since I moved to version 4.8.3 on the 4.x/stable channel, I have been getting these strange MongoDB errors.
It seems the system is suffering from intermittent disconnections on 127.0.0.1:27017.

Below are the logs from Rocket.Chat:

I20221107-14:46:01.092(1) Exception in setInterval callback: Error: connect ECONNREFUSED 127.0.0.1:27017     at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1159:16)     at TCPConnectWrap.callbackTrampoline (internal/async_hooks.js:130:17) {   name: 'MongoNetworkError' } 
I20221107-14:47:04.088(1) Exception while invoking method 'UserPresence:away' MongoNetworkError: connection 28 to 127.0.0.1:27017 closed     at Connection.handleIssue (/snap/rocketchat-server/1523/programs/server/npm/node_modules/meteor/npm-mongo/node_modules/mongodb/lib/cmap/connection.js:129:15)     at Socket.<anonymous> (/snap/rocketchat-server/1523/programs/server/npm/node_modules/meteor/npm-mongo/node_modules/mongodb/lib/cmap/connection.js:62:35)     at Socket.emit (events.js:400:28)     at Socket.emit (domain.js:475:12)     at TCP.<anonymous> (net.js:686:12)     at TCP.callbackTrampoline (internal/async_hooks.js:130:17)
I20221107-15:07:21.129(1) Exception in setInterval callback: Error: connect ECONNREFUSED 127.0.0.1:27017     at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1159:16)     at TCPConnectWrap.callbackTrampoline (internal/async_hooks.js:130:17) {   name: 'MongoNetworkError' } 
I20221107-14:56:43.789(1) Exception from sub stream-notify-room id ug8APxAqXSGBge24Q Error: connect ECONNREFUSED 127.0.0.1:27017     at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1159:16)     at TCPConnectWrap.callbackTrampoline (internal/async_hooks.js:130:17)  => awaited here:     at Function.Promise.await (/snap/rocketchat-server/1523/programs/server/npm/node_modules/meteor/promise/node_modules/meteor-promise/promise_server.js:56:12)     at server/modules/notifications/notifications.module.ts:187:19     at /snap/rocketchat-server/1523/programs/server/npm/node_modules/meteor/promise/node_modules/meteor-promise/fiber_pool.js:43:40  => awaited here:     at Function.Promise.await (/snap/rocketchat-server/1523/programs/server/npm/node_modules/meteor/promise/node_modules/meteor-promise/promise_server.js:56:12)     at server/modules/streamer/streamer.module.ts:181:7     at /snap/rocketchat-server/1523/programs/server/npm/node_modules/meteor/promise/node_modules/meteor-promise/fiber_pool.js:43:40  => awaited here:     at Function.Promise.await (/snap/rocketchat-server/1523/programs/server/npm/node_modules/meteor/promise/node_modules/meteor-promise/promise_server.js:56:12)     at Subscription.<anonymous> (app/notifications/server/lib/Notifications.ts:18:19)     at packages/matb33:collection-hooks/server.js:33:71     at Meteor.EnvironmentVariable.EVp.withValue (packages/meteor.js:1257:12)     at Subscription._handler (packages/matb33:collection-hooks/server.js:33:26)     at maybeAuditArgumentChecks (packages/ddp-server/livedata_server.js:1885:12)     at packages/ddp-server/livedata_server.js:1107:9     at Meteor.EnvironmentVariable.EVp.withValue (packages/meteor.js:1257:12)     at Subscription._runHandler (packages/ddp-server/livedata_server.js:1106:60)     at Session._startSubscription (packages/ddp-server/livedata_server.js:917:9)     at Session.sub (packages/ddp-server/livedata_server.js:673:12)     at packages/ddp-server/livedata_server.js:603:43 

So far everything is fine on version 4.8.3,
but when I move to 5.x the system becomes unstable. The main issue is that after a short time messages are greyed out in the chat interface, and you have to constantly refresh the server to get everything updated…

Server Setup Information

  • Version of Rocket.Chat Server: 4.8.3

  • Operating System: Ubuntu 20.04 LTS

  • Deployment Method: snap

  • Number of Running Instances: 1

  • DB Replicaset Oplog: Enabled

  • NodeJS Version: 14.18.3 - x64

  • MongoDB Version: 4.2.17

  • Proxy: nginx

  • Firewalls involved: no

  • Active users: < 100

Any additional Information

I have tried almost everything…
from timeouts on MongoDB (env variables…) to rlimits, iptables, and firewall rules,
binding options in the MongoDB config file.
I also tried resetting the network and changing the network adapter.
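For example, these are the kinds of timeout options I tried via the connection string (the values here are just examples, and getting them into the snap's environment is its own challenge):

# Rocket.Chat reads MONGO_URL / MONGO_OPLOG_URL; example timeout options on the connection string
export MONGO_URL="mongodb://localhost:27017/parties?connectTimeoutMS=30000&socketTimeoutMS=60000"
export MONGO_OPLOG_URL="mongodb://localhost:27017/local"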

Current VM spec:
VMware 6.7 VM
16 vCPU
16 GB RAM
Network adapter: e1000e (also tried vmxnet3)

Hi!

For some reason your MongoDB is refusing connections from the app.

Do you have any outstanding logs from MongoDB itself?
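If you are not sure where to find them, mongod's output goes to the journal for the snap's mongo service; something like this should show it:

# follow recent output from the snap's bundled mongod
sudo journalctl -u snap.rocketchat-server.rocketchat-mongo.service --since "1 hour ago"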

Hello @dudanogueira, I really can't tell much from them - I'm not a MongoDB expert :frowning: - though I can see some errors here and there… I guess they're not very usual…

Can you please have a look?
I've just restarted the mongo and server services with default env parameters.

I've filtered on the error lines:

Line   4: 2022-11-08T09:26:08+01:00 mongod.27017[349063]: 2022-11-08T09:26:08.608+0100 I  COMMAND  [conn36] command parties.rocketchat_read_receipts command: createIndexes { createIndexes: "rocketchat_read_receipts", indexes: [ { key: { roomId: 1, userId: 1, messageId: 1 }, unique: true, name: "roomId_1_userId_1_messageId_1" }, { key: { messageId: 1 }, name: "messageId_1" } ], lsid: { id: UUID("26d950ca-7a4c-4ce2-b2f4-3eb50cfebec3") }, $clusterTime: { clusterTime: Timestamp(1667895958, 1), signature: { hash: BinData(0, 0000000000000000000000000000000000000000), keyId: 0 } }, $db: "parties" } numYields:2850 ok:0 errMsg:"E11000 duplicate key error collection: parties.rocketchat_read_receipts index: roomId_1_userId_1_messageId_1 dup key: { roomId: \"2Gzrcr9m5SbAkcmTL\", userId: \"u6AQme3c4owKAduWm\", messageId: \"CWBM6hitjcnuF5GB4\" }" errName:DuplicateKey errCode:11000 reslen:587 locks:{ ParallelBatchWriterMode: { acquireCount: { r: 2852 } }, ReplicationStateTransition: { acquireCount: { w: 2853 } }, Global: { acquireCount: { r: 1, w: 2852 } }, Database: { acquireCount: { r: 1, w: 2852 } }, Collection: { acquireCount: { r: 2852, w: 1, R: 1, W: 2 } }, Mutex: { acquireCount: { r: 4 } } } flowControl:{ acquireCount: 2851, timeAcquiringMicros: 1152 } storage:{ data: { bytesRead: 307352168, timeReadingMicros: 223968 } } protocol:op_msg 3747ms
Line  12: 2022-11-08T09:26:54+01:00 mongod.27017[349063]: 2022-11-08T09:26:54.905+0100 E  STORAGE  [TTLMonitor] WiredTiger error (0) [1667896014:905369][349063:0x7fa3026f8700], file:collection-184--4194113896999988117.wt, WT_CURSOR.prev: __wt_block_read_off, 271: collection-184--4194113896999988117.wt: read checksum error for 24576B block at offset 40960: calculated block checksum  doesn't match expected checksum Raw: [1667896014:905369][349063:0x7fa3026f8700], file:collection-184--4194113896999988117.wt, WT_CURSOR.prev: __wt_block_read_off, 271: collection-184--4194113896999988117.wt: read checksum error for 24576B block at offset 40960: calculated block checksum  doesn't match expected checksum

Line  19: 2022-11-08T09:26:54+01:00 mongod.27017[349063]: - (error decoding original message: message key "MESSAGE" truncated)
Line  20: 2022-11-08T09:26:54+01:00 mongod.27017[349063]: - (error decoding original message: message key "MESSAGE" truncated)

Line  28: 2022-11-08T09:26:54+01:00 mongod.27017[349063]: - (error decoding original message: message key "MESSAGE" truncated)
Line 329: 2022-11-08T09:27:58+01:00 mongod.27017[349803]: - (error decoding original message: message key "MESSAGE" truncated)

Line 351: 2022-11-08T09:27:58+01:00 mongod.27017[349803]: - (error decoding original message: message key "MESSAGE" truncated)
Line 352: 2022-11-08T09:27:58+01:00 mongod.27017[349803]: - (error decoding original message: message key "MESSAGE" truncated)
Line 353: 2022-11-08T09:27:58+01:00 mongod.27017[349803]: 2022-11-08T09:27:58.198+0100 E  STORAGE  [TTLMonitor] WiredTiger error (-31802) [1667896078:198507][349803:0x7f6f80c0a700], file:collection-184--4194113896999988117.wt, WT_CURSOR.prev: __wt_block_read_off, 285: collection-184--4194113896999988117.wt: fatal read error: WT_ERROR: non-specific WiredTiger error Raw: [1667896078:198507][349803:0x7f6f80c0a700], file:collection-184--4194113896999988117.wt, WT_CURSOR.prev: __wt_block_read_off, 285: collection-184--4194113896999988117.wt: fatal read error: WT_ERROR: non-specific WiredTiger error
Line 354: 2022-11-08T09:27:58+01:00 mongod.27017[349803]: 2022-11-08T09:27:58.198+0100 E  STORAGE  [TTLMonitor] WiredTiger error (-31804) [1667896078:198540][349803:0x7f6f80c0a700], file:collection-184--4194113896999988117.wt, WT_CURSOR.prev: __wt_panic, 489: the process must exit and restart: WT_PANIC: WiredTiger library panic Raw: [1667896078:198540][349803:0x7f6f80c0a700], file:collection-184--4194113896999988117.wt, WT_CURSOR.prev: __wt_panic, 489: the process must exit and restart: WT_PANIC: WiredTiger library panic
Line 357: 2022-11-08T09:27:58+01:00 mongod.27017[349803]: - (error decoding original message: message key "MESSAGE" truncated)
Line 533: 2022-11-08T09:28:01+01:00 mongod.27017[350066]: 2022-11-08T09:28:01.219+0100 W  FTDC     [initandlisten] Error getting directory iterator '/sys/block': Permission denied
Line 541: 2022-11-08T09:28:01+01:00 mongod.27017[350066]: 2022-11-08T09:28:01.351+0100 I  REPL     [replexec-0] New replica set config in use: { _id: "rs0", version: 3, protocolVersion: 1, writeConcernMajorityJournalDefault: true, members: [ { _id: 0, host: "localhost:27017", arbiterOnly: false, buildIndexes: true, hidden: false, priority: 1.0, tags: {}, slaveDelay: 0, votes: 1 } ], settings: { chainingAllowed: true, heartbeatIntervalMillis: 2000, heartbeatTimeoutSecs: 10, electionTimeoutMillis: 10000, catchUpTimeoutMillis: -1, catchUpTakeoverDelayMillis: 30000, getLastErrorModes: {}, getLastErrorDefaults: { w: 1, wtimeout: 0 }, replicaSetId: ObjectId('58f6e8cc5f6cbe7e8b692e33') } }

2022-11-08T09:42:43+01:00 mongod.27017[354475]: 2022-11-08T09:42:43.576+0100 E  STORAGE  [TTLMonitor] WiredTiger error (0) [1667896963:576441][354475:0x7f6112c53700], file:collection-184--4194113896999988117.wt, WT_CURSOR.prev: __wt_block_read_off, 271: collection-184--4194113896999988117.wt: read checksum error for 24576B block at offset 40960: calculated block checksum  doesn't match expected checksum Raw: [1667896963:576441][354475:0x7f6112c53700], file:collection-184--4194113896999988117.wt, WT_CURSOR.prev: __wt_block_read_off, 271: collection-184--4194113896999988117.wt: read checksum error for 24576B block at offset 40960: calculated block checksum  doesn't match expected checksum


2022-11-08T09:43:46+01:00 mongod.27017[354823]: 2022-11-08T09:43:46.845+0100 E  STORAGE  [TTLMonitor] WiredTiger error (-31802) [1667897026:845316][354823:0x7f1edffe0700], file:collection-184--4194113896999988117.wt, WT_CURSOR.prev: __wt_block_read_off, 285: collection-184--4194113896999988117.wt: fatal read error: WT_ERROR: non-specific WiredTiger error Raw: [1667897026:845316][354823:0x7f1edffe0700], file:collection-184--4194113896999988117.wt, WT_CURSOR.prev: __wt_block_read_off, 285: collection-184--4194113896999988117.wt: fatal read error: WT_ERROR: non-specific WiredTiger error
2022-11-08T09:43:46+01:00 mongod.27017[354823]: 2022-11-08T09:43:46.845+0100 E  STORAGE  [TTLMonitor] WiredTiger error (-31804) [1667897026:845353][354823:0x7f1edffe0700], file:collection-184--4194113896999988117.wt, WT_CURSOR.prev: __wt_panic, 489: the process must exit and restart: WT_PANIC: WiredTiger library panic Raw: [1667897026:845353][354823:0x7f1edffe0700], file:collection-184--4194113896999988117.wt, WT_CURSOR.prev: __wt_panic, 489: the process must exit and restart: WT_PANIC: WiredTiger library panic
2022-11-08T09:43:46+01:00 mongod.27017[354823]: 2022-11-08T09:43:46.848+0100 F  -        [TTLMonitor] Fatal Assertion 50853 at src/mongo/db/storage/wiredtiger/wiredtiger_util.cpp 486
2022-11-08T09:43:46+01:00 mongod.27017[354823]: 2022-11-08T09:43:46.848+0100 F  -        [TTLMonitor] \n\n***aborting after fassert() failure\n\n
2022-11-08T09:43:46+01:00 mongod.27017[354823]: - (error decoding original message: message key "MESSAGE" truncated)
2022-11-08T09:43:47+01:00 systemd[1]: snap.rocketchat-server.rocketchat-mongo.service: Main process exited, code=killed, status=6/ABRT
2022-11-08T09:43:47+01:00 systemd[1]: snap.rocketchat-server.rocketchat-mongo.service: Failed with result 'signal'.
2022-11-08T09:43:48+01:00 systemd[1]: snap.rocketchat-server.rocketchat-mongo.service: Scheduled restart job, restart counter is at 17.
2022-11-08T09:43:48+01:00 systemd[1]: Stopped Service for snap application rocketchat-server.rocketchat-mongo.
2022-11-08T09:43:48+01:00 systemd[1]: Starting Service for snap application rocketchat-server.rocketchat-mongo...
2022-11-08T09:43:48+01:00 rocketchat-server.rocketchat-mongo[355181]: about to fork child process, waiting until server is ready for connections.
2022-11-08T09:43:48+01:00 rocketchat-server.rocketchat-mongo[355182]: forked process: 355183

2022-11-08T09:44:53+01:00 mongod.27017[355426]: 2022-11-08T09:44:53.028+0100 W  FTDC     [initandlisten] Error getting directory iterator '/sys/block': Permission denied
2022-11-08T09:44:53+01:00 mongod.27017[355426]: 2022-11-08T09:44:53.028+0100 I  FTDC     [initandlisten] Initializing full-time diagnostic data capture with directory '/var/snap/rocketchat-server/common/diagnostic.data'


This message leads me to believe an improper shutdown left your database with some corruption.

You will need to force it to repair (see the MongoDB documentation on running mongod with --repair).

However… I am not sure how you could do this in snap =\

Maybe turning MongoDB off and issuing the commands from that doc will be sufficient.
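If it helps, I would expect something along these lines to work; this is only a sketch - the snap revision path (1523) and the dbpath are taken from your logs above, so double-check them on your system:

# stop all services of the snap first
sudo snap stop rocketchat-server
# run the repair as root with the snap's own mongod, so file ownership under the dbpath doesn't change
sudo /snap/rocketchat-server/1523/bin/mongod --dbpath /var/snap/rocketchat-server/common --repair
# then start everything again
sudo snap start rocketchat-server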

Hello @dudanogueira
Again thank you for your help
This is really driving me crazy…

This is what I did:
I stopped the snap services (mongo + server), then
used the mongod binary in /snap/rocketchat-server/1523/bin/ to issue a repair, with the dbpath as set in the mongo config file:

./mongod --dbpath /var/snap/rocketchat-server/common --repair

The overall process went well, I should say - many "success" messages - but I got a few errors and failures.

After that I tried to restart the services, but MongoDB wouldn't start:

2022-11-09T23:35:33+01:00 mongod.27017[1004329]: 2022-11-09T23:35:33.904+0100 F  -        [initandlisten] \n\n***aborting after fassert() failure\n\n
2022-11-09T23:35:33+01:00 rocketchat-server.rocketchat-mongo[1004327]: ERROR: child process failed, exited with error number 14
2022-11-09T23:35:33+01:00 rocketchat-server.rocketchat-mongo[1004327]: To see additional information in this output, start without the "--fork" option.
2022-11-09T23:35:33+01:00 rocketchat-server.rocketchat-mongo[1004302]: [ERROR] mongo server start failed
2022-11-09T23:35:34+01:00 rocketchat-server.rocketchat-mongo[1004390]: 2022-11-09T23:35:34.269+0100 I  CONTROL  [main] Automatically disabling TLS 1.0, to force-enable TLS 1.0 specify --sslDisabledProtocols 'none'
2022-11-09T23:35:34+01:00 rocketchat-server.rocketchat-mongo[1004390]: 2022-11-09T23:35:34.273+0100 W  ASIO     [main] No TransportLayer configured during NetworkInterface startup
2022-11-09T23:35:34+01:00 rocketchat-server.rocketchat-mongo[1004390]: There doesn't seem to be a server running with dbpath: /var/snap/rocketchat-server/common
2022-11-09T23:35:34+01:00 rocketchat-server.rocketchat-mongo[1004368]: [ERROR] mongo server shutdown failed
2022-11-09T23:35:34+01:00 systemd[1]: snap.rocketchat-server.rocketchat-mongo.service: Succeeded.
2022-11-09T23:35:34+01:00 systemd[1]: Started Service for snap application rocketchat-server.rocketchat-mongo.

Hello RC Community, @aaron.ogle @dudanogueira @debdut.chakraborty
I hope this is not abusing the "@", but I really need some help on this one… :slight_smile:

Problems on my DB instance keep piling up :slight_smile:
In addition to my MongoDB issues explained above, I think I may have a corrupt .wt collection file.

The file in question is
collection-184--4194113896999988117.wt

The logs are as follows:

nov. 23 09:39:27 STE-02001.ADS.LOCAL mongod.27017[2595129]: 2022-11-23T09:39:27.223+0100 E  STORAGE  [TTLMonitor] WiredTiger error (0) [1669192767:223424][2595129:0x7f36d77ad700], file:collection-184--4194113896999988117.wt, WT_CURSOR.prev: __wt_bm_corrupt_dump, 135: {40960, 24576, 0x534a21da}: (chunk 1 of 24): 00 00 00 00 00 00 00 00 ef 63 49 01 00 00 00 00 01 2a 01 00 b4 03 00 00 07 05 00 00 00 60 00 00 00 00 00 00 01 00 00 00 11 e3 15 31 a5 80 c0 18 98 00 00 00 02 5f 69 64 00 12 00 00 00 45 4b 43 fd 59 00 00 00 00 00 00 c1 d3 04 f0 58 42 42 67 68 57 61 33 44 32 38 46 36 50 74 00 09 69 6e 74 65 6e 64 65 64 41 74 00 80 5a 1c fe 81 01 00 00 02 6e 61 6d 65 00 26 00 00 00 47 65 6e 65 72 61 74 65 20 64 6f 77 6e 6c 6f 61 64 20 66 69 6c 65 73 20 66 6f 72 20 75 73 65 72 20 64 61 74 61 00 09 73 74 61 72 74 05 43 00 9b 0d 43 18 09 66 69 6e 69 73 68 05 14 00 9e 0d 14 24 03 72 65 73 75 6c 74 00 05 00 01 01 98 11 e3 15 31 a6 80 c0 18 98 00 00 00 02 5f 69 64 00 12 00 00 00 62 79 79 53 6b 57 50 7a 6a 36 37 78 45 36 73 70 4e 00 2e a0 00 08 40 2f 1e fe a0 00 00 60 0d 43 2e a0 00 00 64 0d 14 46 a0 00 00 a7 3e a0 00 40 41 43 32 42 54 53 52 63 6f 64 79 39 38 79 36 59 4d 32 40 01 08 00 04 20 fe a0 00 00 28 0d 43 2e a0 00 00 2f 0d 14 46 a0 00 10 a8 80 c0 0f 8f 2e 40 01 40 75 39 36 45 64 78 64 74 4e 36 69 70 4d 71 53 58 51 6a a0 00 04 1d 00 3d e0 4c 61 6e 64 20 73 61 76 65 20 73 74 61 74 69 73 74 69 63 73 00 3d d7 04 1b 06 09 da 2e 97 00 00 29 0d 14 46 97 00 00 a9 3e 37 01 40 6e 7a 4e 6b 7a 51 54 46 50 38 66 45 66 42 43 38 4a 32 97 00 08 c0 d8 21 fe 37 01 00 f3 0d 43 2e a0 00 00 f6 0d 14 46 a0 00 00 aa 3e a0 00 40 79 32 4d 69 4d 33 4e 38 64 34 34 4d 37 6f 64 70 52 32 a0 00 08 80 ad 23 fe a0 00 00 ba 0d 43 2e a0 00 00 bf 0d 14 46 a0 00 00 ab 3e a0 00 40 36 33 52 52 6d 50 36 4b 75 62 39 6d 37 39 5a 62 66 32 a0 00 08 40 82 25 fe a0 00 00 83 0d 43 2e a0 00 00 87 0d 14 46 a0 00 00 ac 3e a0 00 40 67 79 4e 66 42 34 72 51 73 70 33 77 44 50 4b 6d 48 32 a0 00 08 00 57 27 fe a0 00 00 4b 0d 43 2e a0 00 00 4e 0d 14 46 a0 00 00 ad 3e a0 00 40 4b 33 4e 6b 46 69 66 5a 67 6f 43 64 47 5a 76 78 62 32 a0 00 08 c0 2b 29 fe a0 00 04 11 2c 09 43 2e a0 00 00 14 0d 14 46 a0 00 00 ae 3e a0 00 40 51 6e 73 52 72 6a 62 71 65 54 64 57 73 32 6f 7a 46 32 a0 00 08 80 00 2b fe a0 00 00 db 0d 43 2e a0 00 00 de 0d 14 46 a0 00 00 af 3e a0 00 40 7a 36 4b 6b 71 58 48 74 53 54 63 59 69 35 6b 63 6e 32 a0 00 08 40 d5 2c fe a0 00 00 a4 0d 43 2e a0 00 00 a7 0d 14 46 a0 00 00 b0 3e a0 00 40 69 65 53 33 61 74 6e 47 59 4e 5a 64 7a 78 34 45 77 32 a0 00 08 00 aa 2e fe a0 00 00 6c 0d 43 2e a0 00 00 6f 0d 14 46 a0 00 00 b1 3e a0 00 40 59 76 32 39 62 71 43 54 41 57 4d 35 32 71 58 53 6d 32 a0 00 08 c0 7e 30 fe a0 00 04 3b 7f 09 43 2e a0 00 00 3e 0d 14 46 a0 00 00 b2 3e a0 00 40 6f 34 6a 65 4b 62 65 6b 78 75 4a 45 74 6d 69 38 68 32 a0 00 08 80 53 32 fe a0 00 04 02 54 09 43 2e a0 00 00 05 0d 14 46 a0 00 00 b3 3e a0 00 40 42 33 50 61 54 47 62 74 61 64 75 35 78 33 6e 67 63 32 a0 00 08 40 28 34 fe a0 00 00 c9 0d 43 2e a0 00 00 cc 0d 14 46 a0 00 00 b4 3e a0 00 40 68 34 6f 69 79 34 50 59 46 54 4c 38 73 48 7a 43 58 32 a0 00 08 00 fd 35 fe a0 00 00 90 0d 43 2e a0 00 00 93 0d 14 46 a0 00 00 b5 3e a0 00 40 53 5a 36 46 72 47 6a 45 39 71 6b 43 6b 47 73 66 64 32 a0 00 08 c0 d1 37 fe a0 00 04 57 d2 09 43 2e a0 00 00 5a 0d 14 46 a0 00 00 b6 3e a0 00 40 7a 68 72 48 41 69 70 64 61 38 69 4c 48 54 53 73 6f 32 a0 00  Raw: [1669192767:223424][2595129:0x7f36d77ad700], file:collection-184--4194113896999988117.wt, WT_CURSOR.prev: __wt_bm_corrupt_dump, 135: {40960, 24576, 0x534a21da}: 
(chunk 1 of 24): 00 00 00 00 00 00 00 00 ef 63 49 01 00 00 00 00 01 2a 01 00 b4 03 00 00 07 05 00 00 00 60 00 00 00 00 00 00 01 00 00 00 11 e3 15 31 a5 80 c0 18 98 00 00 00 02 5f 69 64 00 12 00 00 00 45 4b 43 fd 59 00 00 00 00 00 00 c1 d3 04 f0 58 42 42 67 68 57 61 33 44 32 38 46 36 50 74 00 09 69 6e 74 65 6e 64 65 64 41 74 00 80 5a 1c fe 81 01 00 00 02 6e 61 6d 65 00 26 00 00 00 47 65 6e 65 72 61 74 65 20 64 6f 77 6e 6c 6f 61 64 20 66 69 6c 65 73 20 66 6f 72 20 75 73 65 72 20 64 61 74 61 00 09 73 74 61 72 74 05 43 00 9b 0d 43 18 09 66 69 6e 69 73 68 05 14 00 9e 0d 14 24 03 72 65 73 75 6c 74 00 05 00 01 01 98 11 e3 15 31 a6 80 c0 18 98 00 00 00 02 5f 69 64 00 12 00 00 00 62 79 79 53 6b 57 50 7a 6a 36 37 78 45 36 73 70 4e 00 2e a0 00 08 40 2f 1e fe a0 00 00 60 0d 43 2e a0 00 00 64 0d 14 46 a0 00 00 a7 3e a0 00 40 41 43 32 42 54 53 52 63 6f 64 79 39 38 79 36 59 4d 32 40 01 08 00 04 20 fe a0 00 00 28 0d 43 2e a0 00 00 2f 0d 14 46 a0 00 10 a8 80 c0 0f 8f 2e 40 01 40 75 39 36 45 64 78 64 74 4e 36 69 70 4d 71 53 58 51 6a a0 00 04 1d 00 3d e0 4c 61 6e 64 20 73 61 76 65 20 73 74 61 74 69 73 74 69 63 73 00 3d d7 04 1b 06 09 da 2e 97 00 00 29 0d 14 46 97 00 00 a9 3e 37 01 40 6e 7a 4e 6b 7a 51 54 46 50 38 66 45 66 42 43 38 4a 32 97 00 08 c0 d8 21 fe 37 01 00 f3 0d 43 2e a0 00 00 f6 0d 14 46 a0 00 00 aa 3e a0 00 40 79 32 4d 69 4d 33 4e 38 64 34 34 4d 37 6f 64 70 52 32 a0 00 08 80 ad 23 fe a0 00 00 ba 0d 43 2e a0 00 00 bf 0d 14 46 a0 00 00 ab 3e a0 00 40 36 33 52 52 6d 50 36 4b 75 62 39 6d 37 39 5a 62 66 32 a0 00 08 40 82 25 fe a0 00 00 83 0d 43 2e a0 00 00 87 0d 14 46 a0 00 00 ac 3e a0 00 40 67 79 4e 66 42 34 72 51 73 70 33 77 44 50 4b 6d 48 32 a0 00 08 00 57 27 fe a0 00 00 4b 0d 43 2e a0 00 00 4e 0d 14 46 a0 00 00 ad 3e a0 00 40 4b 33 4e 6b 46 69 66 5a 67 6f 43 64 47 5a 76 78 62 32 a0 00 08 c0 2b 29 fe a0 00 04 11 2c 09 43 2e a0 00 00 14 0d 14 46 a0 00 00 ae 3e a0 00 40 51 6e 73 52 72 6a 62 71 65 54 64 57 73 32 6f 7a 46 32 a0 00 08 80 00 2b fe a0 00 00 db 0d 43 2e a0 00 00 de 0d 14 46 a0 00 00 af 3e a0 00 40 7a 36 4b 6b 71 58 48 74 53 54 63 59 69 35 6b 63 6e 32 a0 00 08 40 d5 2c fe a0 00 00 a4 0d 43 2e a0 00 00 a7 0d 14 46 a0 00 00 b0 3e a0 00 40 69 65 53 33 61 74 6e 47 59 4e 5a 64 7a 78 34 45 77 32 a0 00 08 00 aa 2e fe a0 00 00 6c 0d 43 2e a0 00 00 6f 0d 14 46 a0 00 00 b1 3e a0 00 40 59 76 32 39 62 71 43 54 41 57 4d 35 32 71 58 53 6d 32 a0 00 08 c0 7e 30 fe a0 00 04 3b 7f 09 43 2e a0 00 00 3e 0d 14 46 a0 00 00 b2 3e a0 00 40 6f 34 6a 65 4b 62 65 6b 78 75 4a 45 74 6d 69 38 68 32 a0 00 08 80 53 32 fe a0 00 04 02 54 09 43 2e a0 00 00 05 0d 14 46 a0 00 00 b3 3e a0 00 40 42 33 50 61 54 47 62 74 61 64 75 35 78 33 6e 67 63 32 a0 00 08 40 28 34 fe a0 00 00 c9 0d 43 2e a0 00 00 cc 0d 14 46 a0 00 00 b4 3e a0 00 40 68 34 6f 69 79 34 50 59 46 54 4c 38 73 48 7a 43 58 32 a0 00 08 00 fd 35 fe a0 00 00 90 0d 43 2e a0 00 00 93 0d 14 46 a0 00 00 b5 3e a0 00 40 53 5a 36 46 72 47 6a 45 39 71 6b 43 6b 47 73 66 64 32 a0 00 08 c0 d1 37 fe a0 00 04 57 d2 09 43 2e a0 00 00 5a 0d 14 46 a0 00 00 b6 3e a0 00 40 7a 68 72 48 41 69 70 64 61 38 69 4c 48 54 53 73 6f 32 a0 00

Can this be the main issue with my MongoDB instance?
Is there a way to repair/recover it in a snap environment?
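From what I could find, something like this should at least map that file back to a collection name; just a sketch, assuming the snap ships a mongo shell next to mongod and that the collection lives in the parties database:

# collStats exposes the backing WiredTiger table via wiredTiger.uri
/snap/rocketchat-server/1523/bin/mongo parties --eval '
  db.getCollectionNames().forEach(function (c) {
    try {
      // uri looks like "statistics:table:collection-184--4194113896999988117"
      var uri = db.getCollection(c).stats().wiredTiger.uri;
      if (uri.indexOf("collection-184--4194113896999988117") !== -1) {
        print(c + " -> " + uri);
      }
    } catch (e) { /* skip anything whose stats cannot be read */ }
  })'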

Again - I forget if I mentioned it before - I'm still not able to get a full backup of the DB with

snap run rocketchat-server.backupdb

I get this error in the backup log:

Failed: error writing data for collection `parties.rocketchat_cron_history` to disk: error reading collection: connection(localhost:27017[-4]) incomplete read of message header: EOF
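Would excluding that collection from the dump be a safe workaround in the meantime? Something like this, assuming the snap ships mongodump alongside mongod (the path below is a guess):

# partial dump that skips the collection the backup chokes on
/snap/rocketchat-server/1523/bin/mongodump --db parties --excludeCollection rocketchat_cron_history --out /tmp/partial-dump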

Please help!

@mehdi.yayaoui

First off, copy the whole data directory to back it up, if you haven't already :slight_smile:

Secondly, drop the cron_history collection, then try again.
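Roughly like this; a sketch only - the mongo shell path assumes the snap bundles one, and parties is the database name from your logs:

# 1) with the services stopped, copy the whole data directory somewhere safe
sudo snap stop rocketchat-server
sudo cp -a /var/snap/rocketchat-server/common /root/rocketchat-data-backup
sudo snap start rocketchat-server
# 2) drop the collection the backup chokes on (name taken from your error message)
/snap/rocketchat-server/1523/bin/mongo parties --eval 'db.rocketchat_cron_history.drop()'
# 3) retry the backup
sudo snap run rocketchat-server.backupdb

The cron history is just scheduler bookkeeping, so dropping it should not lose anything important.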


It's magic!!
That did the trick.

Backup OK.
I was able to refresh to 5.x with no issues so far.

many thanks @debdut.chakraborty
