SSH SYN requests


I’m a network admin who has noticed that all my Rocket.Chat clients are sending SYN requests to the Rocket.Chat server on port 22. We have that port closed on the server we host in AWS, but that doesn’t stop the constant attempts. Has anyone else noticed this? And were you successful in getting the clients to stop?

It’s sending requests so often that they amount to 5 or 6 GB per client per day. It’s getting to be a bit much, and I would like to cut out the unnecessary traffic.


Are you sure these are requests from RC clients and not the general noise on port 22?


Yes, the requests go directly to the Rocket.Chat server, and there is no other SSH traffic. If you exit Rocket.Chat, the SYN requests stop.


Just checked that on my test server, which runs ufw with high logging. I don’t see any requests on port 22 when connecting with either the desktop client or the browser.


Hmm. It’s weird, though. The only thing being sent is SYN requests. `netstat -a -b` shows the port assigned to the Rocket.Chat client/server connection, and it’s a dynamic local port connecting out to port 22.
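For anyone wanting to reproduce the check: on a Windows client you can list the connection along with the owning process ID (the addresses and PID below are placeholders, not from my capture; `-b` requires an elevated prompt):

```shell
# From an elevated prompt: -n skips name resolution, -o adds the owning PID.
netstat -a -n -o | findstr ":22"
# Example output shape (values are illustrative only):
#   TCP    10.0.0.15:49732    203.0.113.10:22    SYN_SENT    4312
```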


I disabled the ufw firewall, and even `tcpdump -i eth0 port 22` does not show any connections on port 22 when using the desktop client.
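If it helps anyone reproducing this, a capture filter that isolates just the SYNs (assuming `eth0` is the right interface on your box) would be:

```shell
# Capture only TCP segments to/from port 22 with the SYN flag set;
# -n skips name resolution. Requires root.
sudo tcpdump -n -i eth0 'tcp port 22 and tcp[tcpflags] & tcp-syn != 0'
```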


It’s just spamming SYN packets at the SSH port on the server. The dynamic source port changes, but there is never a response; the server is not answering. I’m using Wireshark to help isolate any other associated traffic. Any insight would be appreciated.
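For anyone following along in Wireshark, a display filter that shows only bare SYNs to port 22 (no SYN-ACKs) can also be used from tshark, Wireshark’s CLI; the interface name is an assumption:

```shell
# -Y applies a Wireshark display filter: SYN set, ACK clear, destination port 22.
tshark -i eth0 -Y 'tcp.dstport == 22 && tcp.flags.syn == 1 && tcp.flags.ack == 0'
```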


Rocket.Chat itself doesn’t do anything out of the box with port 22, and definitely nothing speaking SSH either.

My thought would be that maybe someone sent a link like yourserver:22 and the client is attempting to resolve it? Or you have some custom JavaScript running that has each client trying to log in. It could be a number of things, but for sure nothing out of the box.

I’d open it in a browser window and see if it still happens. If so, open the dev tools, switch to the Network tab, and see whether requests are being made to your server on port 22.


So, Wireshark returned something fun. It turns out the clients are negotiating SSH handshakes and transferring files, several hundred times a day. I’m going to use this to force them to give me access to the Rocket.Chat server so I can take a look, because I was ASSURED that the server was denying the connections.


Are the connections coming from the clients (Rocket.Chat clients) or from the server Rocket.Chat is sitting on?

If they’re coming from the clients, the advice suggested above should work out fine and yield some results.

If they’re coming from the server… that’s a whole other set of possibilities, chief among them a compromised server underneath Rocket.Chat.


It looks to be initiating from the client each time. Going to take a few more captures while I’m waiting for server access.


I know a network admin likes falling back to tools like Wireshark, but it would be good to see whether this is related to the Rocket.Chat desktop client or whether it also initiates from the web app. Opening it in a browser would show this.

Also, if it’s happening in the browser, you’ll be able to see the actual requests in the dev tools.

In Chrome, right-clicking the page and choosing Inspect, then switching to the Network tab, should show you the list of requests.

If it’s happening in both, and you see activity there, log into the admin section of Rocket.Chat and take a look at Custom JavaScript; you can probably see the cause.


It just keeps creating the xhr requests. I am still waiting to get admin access to the server to locate the cause, though I suspect that the xhr and xhr-send requests mimic the SSHv2 traffic I’m seeing.



This error shows up at launch.


wss:// traffic should be retrying on port 443 - port 22 is still a mystery :thinking:


Yeah, both of these are symptoms of websocket support not being enabled in your reverse proxy. It’s trying to upgrade a connection from HTTP to a websocket and failing, so it falls back to polling.

Fixing the reverse proxy / load balancer will keep it from doing those.
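For example, if nginx is the proxy (as in the split-off topic below this one), the usual fix is to forward the upgrade headers; the hostname and upstream port 3000 here are assumptions based on Rocket.Chat’s defaults:

```nginx
location / {
    proxy_pass http://127.0.0.1:3000;   # Rocket.Chat's default listen port (assumed)
    proxy_http_version 1.1;             # websockets require HTTP/1.1
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}
```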

But that being said… unless your server is running at https://chat.yourdomain:22, it for sure wouldn’t be hitting port 22.

So if that’s the case it’s not Rocket.Chat causing the traffic to port 22.

This is definitely an interesting case… please keep us updated. Would love to see what’s causing this.


2 posts were split to a new topic: Nginx config for websocket support


@MrBilltheITGuy. Any conclusions on this mystery?


Yes, would like to know too.


TL;DR - netstat assigned the wrong application to the ports in use.

Basically, netstat incorrectly identified Rocket.Chat as the application using the ports and connection. In reality, it was a new screen-capture program (which doesn’t show up in netstat at all) sending recorded screen footage to cloud storage over SSH. So, false alarm, people.
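For anyone who hits something similar: a quick way to catch this kind of misattribution on a Windows client is to cross-check the PID that netstat reports against the task list, rather than trusting the executable name that `-b` prints (the port and PID below are placeholders):

```shell
# Get the owning PID for connections to port 22 (no -b, so no name guessing)...
netstat -a -n -o | findstr ":22"
# ...then look that PID up directly and compare with what `netstat -b` claimed.
tasklist /FI "PID eq 4312"
```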

I’m still trying to figure out why the traffic was dependent on Rocket.Chat being open in an application or browser. It’s possible the capture software thinks it is integrated with chat and doesn’t record when chat isn’t open. That would explain why the clients differ in the amount of SSH traffic each day, and why some stations send more data than others even when a user is there for the same amount of time: our users don’t all use Rocket.Chat to the same degree, and some don’t keep the application open all day, it seems. Still puzzling that one out.

Thanks everyone for the help!! And for the reminder to post what happened. I hate not seeing the resolution when I’m searching forums like this. :stuck_out_tongue: