I have an app that runs several threads in the background. On Windows and macOS it uses essentially no CPU when idle, but on Linux it constantly sits at 100%. Most of the threads block on tryAcquire mutex locks with 250ms timeouts, and the loops in their run functions also sleep for 250ms per iteration. Any ideas what could cause the CPU usage only on the Linux host, and how to fix it?
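For reference, here is a minimal sketch of the loop structure I'm describing (names are illustrative; I'm using `std::timed_mutex::try_lock_for` to stand in for the tryAcquire call, which in my app is from a different library):

```cpp
#include <atomic>
#include <chrono>
#include <mutex>
#include <thread>

std::timed_mutex work_mutex;
std::atomic<bool> running{true};

void worker() {
    using namespace std::chrono_literals;
    while (running) {
        // Wait up to 250 ms for the lock; a correct timed wait should
        // block in the kernel during this interval, not spin.
        if (work_mutex.try_lock_for(250ms)) {
            // ... do work ...
            work_mutex.unlock();
        }
        // Idle pause between iterations.
        std::this_thread::sleep_for(250ms);
    }
}
```

With both a 250ms timed wait and a 250ms sleep per iteration, each thread should wake at most a few times per second, so I'd expect near-zero CPU when idle on every platform.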
Thanks!