When a system resource such as memory is consumed beyond its threshold, the operating system kills the processes that consume the most resources. Java in particular does its own resource management while a Java process is running, and the censhare Java process needs a lot of memory. When there is not enough memory available for the system, it kills the biggest process in order to keep its own operations running.

Symptoms:

When the censhare Java process stops writing to its log, a possible cause is an out-of-memory situation. In this case, check whether a censhare Java process was killed by the OOM killer with the following command:

dmesg -T | grep -i kill
CODE

Another way to find the cause is to look for the following error in the system log files (for example /var/log/messages):

"Out of memory: Kill process 1926 (java) score 371 or sacrifice child
 Killed process 1926 (java)"
CODE
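On systems that use systemd, the kernel log can also be searched directly; this is a quick alternative to reading /var/log/messages:

journalctl -k | grep -i "out of memory"
CODE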

Cause:

Because the censhare Java process uses the most memory, it gets killed first in order to free up memory and keep the system running.

The OS has process management that monitors resource utilization and makes sure that processes with higher priority never run out of system resources. When system resources (CPU, RAM, disk space) become scarce, the system stops or kills processes with lower priority. In most cases, the system kills application processes in order to complete its own operations.
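To see which processes are the largest memory consumers, and therefore the most likely OOM-killer candidates, the process list can be sorted by memory usage (a generic check, not specific to censhare):

ps aux --sort=-%mem | head -n 10
CODE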

When the Service Client runs on the same machine, it leads to a resource bottleneck, as it starts image-conversion processes. When no Service Client is connected at all, the censhare-Server performs the image conversion on its own by default, which also causes higher memory usage.
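To check whether image-conversion processes are currently running on the server and how much memory they use, a simple process search can help (the search pattern below is only an assumption and may need to be adapted to the converter actually in use):

ps aux --sort=-%mem | grep -iE 'convert|magick'
CODE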

Note:

Every process has its own OOM score, which can be checked in /proc/<pid>/oom_score. The higher the score, the more likely the process is to be killed by the OOM killer.

This file shows the current score the OOM killer uses for any given <pid>. Use it together with /proc/<pid>/oom_score_adj to tune which processes should be killed first in an out-of-memory situation.
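For example, assuming the censhare Java process has PID 1926 as in the log excerpt above, its score can be checked and its kill priority adjusted as follows (the adjustment value of -500 is only an illustration; -1000 would exclude the process from OOM killing entirely, so use it with care):

# show the current OOM score for PID 1926
cat /proc/1926/oom_score
# lower the chance that the OOM killer selects this process (illustrative value)
echo -500 > /proc/1926/oom_score_adj
CODE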

Solution:

1. Verify that the Service Client is connected and runs on another server; this is the recommended setup.

2. If image previews run on the same server, verify that the ImageMagick policy.xml contains the following settings; this is already the default in the later ImageMagick RPMs we provide. On update, please delete customised settings in policy.xml so that the file gets overwritten with the new defaults (see the verification command below).

<policy domain="resource" name="memory" value="2GiB"/>
<policy domain="resource" name="map" value="2GiB"/>
CODE
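After changing policy.xml, the resource limits that ImageMagick actually applies can be verified with:

identify -list resource
CODE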

3. The simplest solution of all is to add more RAM to your server so that it does not run into a resource bottleneck.
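Current memory and swap usage can be checked with:

free -h
CODE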