LXC Out of memory - Containers using up memory!


Hi All,

I am having issues with my containers using up almost all of the VM's memory (32 GB RAM), even though I have set the limit to 16 GB in the containers. Three of the containers (each running Tomcat) connect to a Postgres container; the web applications in those three containers start off well, but then hit OOM errors and fail. Please help, the logs are below.
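For reference, the per-container limit was set with LXD's limits.memory key, along these lines (container name taken from the logs below; the exact value/syntax I used may differ slightly):

lxc config set dev limits.memory 16GiB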

/var/log/kern.log:Jul 15 15:38:50 DHIS2devVM kernel: [ 2542.490039] java.d invoked oom-killer: gfp_mask=0x100cca(GFP_HIGHUSER_MOVABLE), order=0, oom_score_adj=0
/var/log/kern.log:Jul 15 15:38:50 DHIS2devVM kernel: [ 2542.490058] oom_kill_process+0xe6/0x120
/var/log/kern.log:Jul 15 15:38:50 DHIS2devVM kernel: [ 2542.490471] oom-kill:constraint=CONSTRAINT_NONE,nodemask=(null),cpuset=/,mems_allowed=0,global_oom,task_memcg=/user/root/0,task=java.d,pid=27440,uid=0
/var/log/kern.log:Jul 15 15:38:50 DHIS2devVM kernel: [ 2542.490499] Out of memory: Killed process 27440 (java.d) total-vm:2626936kB, anon-rss:594576kB, file-rss:0kB, shmem-rss:0kB, UID:0 pgtables:1340kB oom_score_adj:0
/var/log/kern.log:Jul 15 15:39:20 DHIS2devVM kernel: [ 2571.718189] lxc invoked oom-killer: gfp_mask=0x100cca(GFP_HIGHUSER_MOVABLE), order=0, oom_score_adj=0
/var/log/kern.log:Jul 15 15:39:20 DHIS2devVM kernel: [ 2571.718209] oom_kill_process+0xe6/0x120
/var/log/kern.log:Jul 15 15:39:20 DHIS2devVM kernel: [ 2571.718621] oom-kill:constraint=CONSTRAINT_NONE,nodemask=(null),cpuset=/,mems_allowed=0,global_oom,task_memcg=/lxc/dev/system.slice/tomcat9.service,task=java,pid=27746,uid=100997
/var/log/kern.log:Jul 15 15:39:20 DHIS2devVM kernel: [ 2571.718681] Out of memory: Killed process 27746 (java) total-vm:5315692kB, anon-rss:837324kB, file-rss:0kB, shmem-rss:0kB, UID:100997 pgtables:2008kB oom_score_adj:0
/var/log/kern.log:Jul 15 15:39:30 DHIS2devVM kernel: [ 2581.083685] munin-node invoked oom-killer: gfp_mask=0x100cca(GFP_HIGHUSER_MOVABLE), order=0, oom_score_adj=0
/var/log/kern.log:Jul 15 15:39:30 DHIS2devVM kernel: [ 2581.083705] oom_kill_process+0xe6/0x120
/var/log/kern.log:Jul 15 15:39:31 DHIS2devVM kernel: [ 2581.084136] oom-kill:constraint=CONSTRAINT_NONE,nodemask=(null),cpuset=/,mems_allowed=0,global_oom,task_memcg=/user/root/0,task=java.d,pid=9331,uid=0
/var/log/kern.log:Jul 15 15:39:31 DHIS2devVM kernel: [ 2581.084157] Out of memory: Killed process 9331 (java.d) total-vm:3031224kB, anon-rss:588kB, file-rss:0kB, shmem-rss:0kB, UID:0 pgtables:700kB oom_score_adj:0
/var/log/kern.log:Jul 15 15:39:37 DHIS2devVM kernel: [ 2587.665646] munin-node invoked oom-killer: gfp_mask=0x100cca(GFP_HIGHUSER_MOVABLE), order=0, oom_score_adj=0
/var/log/kern.log:Jul 15 15:39:37 DHIS2devVM kernel: [ 2587.665665] oom_kill_process+0xe6/0x120
/var/log/kern.log:Jul 15 15:39:37 DHIS2devVM kernel: [ 2587.666064] oom-kill:constraint=CONSTRAINT_NONE,nodemask=(null),cpuset=postgres,mems_allowed=0,global_oom,task_memcg=/user/root/0,task=java.d,pid=27985,uid=0
/var/log/kern.log:Jul 15 15:39:37 DHIS2devVM kernel: [ 2587.666094] Out of memory: Killed process 27985 (java.d) total-vm:2626936kB, anon-rss:276784kB, file-rss:0kB, shmem-rss:0kB, UID:0 pgtables:880kB oom_score_adj:0
/var/log/kern.log:Jul 15 15:39:44 DHIS2devVM kernel: [ 2594.470712] munin-node invoked oom-killer: gfp_mask=0x100cca(GFP_HIGHUSER_MOVABLE), order=0, oom_score_adj=0
/var/log/kern.log:Jul 15 15:39:44 DHIS2devVM kernel: [ 2594.470730] oom_kill_process+0xe6/0x120
/var/log/kern.log:Jul 15 15:39:44 DHIS2devVM kernel: [ 2594.471127] oom-kill:constraint=CONSTRAINT_NONE,nodemask=(null),cpuset=postgres,mems_allowed=0,global_oom,task_memcg=/lxc/dev/system.slice/tomcat9.service,task=java,pid=28147,uid=100997
/var/log/kern.log:Jul 15 15:39:44 DHIS2devVM kernel: [ 2594.471189] Out of memory: Killed process 28147 (java) total-vm:5313644kB, anon-rss:440148kB, file-rss:0kB, shmem-rss:0kB, UID:100997 pgtables:1368kB oom_score_adj:0
/var/log/kern.log:Jul 15 15:39:50 DHIS2devVM kernel: [ 2602.412215] java.d invoked oom-killer: gfp_mask=0x100dca(GFP_HIGHUSER_MOVABLE|__GFP_ZERO), order=0, oom_score_adj=0
/var/log/kern.log:Jul 15 15:39:51 DHIS2devVM kernel: [ 2602.412235] oom_kill_process+0xe6/0x120
/var/log/kern.log:Jul 15 15:39:51 DHIS2devVM kernel: [ 2602.412623] oom-kill:constraint=CONSTRAINT_NONE,nodemask=(null),cpuset=/,mems_allowed=0,global_oom,task_memcg=/user/root/0,task=java.d,pid=28006,uid=0
/var/log/kern.log:Jul 15 15:39:51 DHIS2devVM kernel: [ 2602.412652] Out of memory: Killed process 28006 (java.d) total-vm:2626936kB, anon-rss:531680kB, file-rss:0kB, shmem-rss:0kB, UID:0 pgtables:1432kB oom_score_adj:0
/var/log/kern.log:Jul 15 15:40:34 DHIS2devVM kernel: [ 2645.905731] GUsbEventThread invoked oom-killer: gfp_mask=0x100cca(GFP_HIGHUSER_MOVABLE), order=0, oom_score_adj=0
/var/log/kern.log:Jul 15 15:40:34 DHIS2devVM kernel: [ 2645.905748] oom_kill_process+0xe6/0x120
/var/log/kern.log:Jul 15 15:40:34 DHIS2devVM kernel: [ 2645.906045] oom-kill:constraint=CONSTRAINT_NONE,nodemask=(null),cpuset=postgres,mems_allowed=0,global_oom,task_memcg=/lxc/dev/system.slice/tomcat9.service,task=java,pid=28234,uid=100997
/var/log/kern.log:Jul 15 15:40:34 DHIS2devVM kernel: [ 2645.906114] Out of memory: Killed process 28234 (java) total-vm:5315692kB, anon-rss:631528kB, file-rss:0kB, shmem-rss:0kB, UID:100997 pgtables:1824kB oom_score_adj:0
Binary file /var/log/kern.log matches

The Linux VM's free -m:
free -m
total used free shared buff/cache available
Mem: 32117 31737 262 0 117 53
Swap: 1023 1023 0

Whereas the container is as below:
sudo lxc exec dev -- free -gh
total used free shared buff/cache available
Mem: 14Gi 111Mi 14Gi 0.0Ki 155Mi 14Gi
Swap: 1.0Gi 0B 1.0Gi
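To confirm what limit LXD actually applied, the expanded config can be checked from the host (output omitted here):

lxc config show dev --expanded | grep -i memory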

Thanks!

From memory, Java tries to consume as much memory as it can - have you tried setting a limit on the maximum amount of memory it can consume? I think it tries to allocate as much as it can on init!

And when I say “consume” I mean “allocate” at initialization of the JVM (i.e. the start of your app).

I could be totally wrong - but if you limit the app to, say, 4 GB on each (not through LXD but through your init script), does that fix the issue?

Could you please guide me on that? Inside the containers (Tomcat settings), I set the Java heap size as follows:

JAVA_OPTS='-Djava.awt.headless=true -XX:+UseG1GC -Xmx2G -Xms1G -Djava.security.egd=file:/dev/./urandom'

Note that the OOM killer is cgroup/namespace aware, so your container running out of memory will trigger the OOM killer, which will then look for something to kill within that container.

That's different from the case where you have no limit in place, where the global OOM killer triggers and may kill something in an unrelated container or on the host itself.

As mentioned by someone else above, Java can be a bit weird about memory allocation, so tweaking the /etc/default/XYZ file to pass additional options to the JVM is often the way to go.
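On Debian/Ubuntu the Tomcat 9 package typically reads /etc/default/tomcat9 (matching the tomcat9.service seen in the logs), so a minimal sketch would be something like the following, with heap sizes kept comfortably below the container's memory limit (values are illustrative only):

# /etc/default/tomcat9 - picked up by tomcat9.service inside the container
# Cap the heap well below the container limit; adjust the sizes to your workload
JAVA_OPTS="-Djava.awt.headless=true -XX:+UseG1GC -Xms1G -Xmx2G -Djava.security.egd=file:/dev/./urandom"

After changing it, restart the service inside the container (sudo systemctl restart tomcat9).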

Still having trouble despite passing the environment variables to the profile:

config:
  environment.JAVA_HOME: /usr/lib/jvm/java-8-openjdk-amd64
  environment.JAVA_OPTS: '-Djava.awt.headless=true -XX:+UseG1GC -Xmx12G -Xms12G -Djava.security.egd=file:/dev/./urandom'

Memory on the host still gets used up:
free -m
total used free shared buff/cache available
Mem: 32117 31708 292 0 116 83
Swap: 6143 5785 358

Note that if your service is started by systemd, the variables in environment.* will be ignored, as systemd strips its environment before launching services.
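If that is what is happening here, a common workaround is a systemd drop-in inside each container rather than the environment.* keys; a minimal sketch, assuming the tomcat9.service unit from the logs and illustrative heap sizes:

# Inside the container: create an override for the Tomcat unit
sudo systemctl edit tomcat9
# add the following to the override file:
[Service]
Environment="JAVA_OPTS=-Djava.awt.headless=true -XX:+UseG1GC -Xms1G -Xmx2G"
# then restart the service so the new heap settings take effect
sudo systemctl restart tomcat9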