Analyze performance issues in OpenEMS

To get a better understanding of the Bridge architecture, I created a new CAN bridge bundle that is almost a 1:1 copy of the ModbusTCP bundle. The bundle works fine, and in the CAN transaction it simply sets some dummy values (without doing any real CAN communication). When I start the Edge application, the CPU load goes up to 25%. The OSGi console becomes extremely slow, and the OpenEMS UI is unusable.

How can I find the reason for the increased load? Attaching JProfiler to the application and analyzing the results did not help.

Are there any tips and tricks on how to profile/analyze the performance of the OpenEMS framework or a single bundle?
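When an external profiler isn't giving useful results, one lightweight alternative is to sample per-thread CPU time from inside the JVM via `ThreadMXBean`: the thread whose CPU time keeps growing between samples is the one driving the load. A minimal sketch, not OpenEMS-specific (you could run it from a diagnostic component or a Gogo shell command):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

public class BusyThreadFinder {

    // Print the accumulated CPU time of every live thread.
    // Run it twice a few seconds apart: the busy thread is the one
    // whose CPU time grows the fastest.
    public static void main(String[] args) {
        ThreadMXBean mx = ManagementFactory.getThreadMXBean();
        if (mx.isThreadCpuTimeSupported() && !mx.isThreadCpuTimeEnabled()) {
            mx.setThreadCpuTimeEnabled(true);
        }
        for (ThreadInfo info : mx.dumpAllThreads(false, false)) {
            long cpuNanos = mx.getThreadCpuTime(info.getThreadId());
            if (cpuNanos > 0) {
                System.out.printf("%-40s %10.1f ms%n",
                        info.getThreadName(), cpuNanos / 1_000_000.0);
            }
        }
    }
}
```

The same information is available without any code via `jcmd <pid> Thread.print` or Java Flight Recorder, which also work fine against an OSGi process since it is just a regular JVM.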

The Modbus module isn’t threaded, and executes everything on a single thread. I can’t speak to the threaded-ness of OpenEMS itself, but I suspect everything is executing on a single core on your Edge instance. You could try offloading serial I/O to a separate thread or worker, but if that doesn’t help, there may be an issue with your Java setup not exploiting all your CPU cores. I’d check for the latter first.

I threaded serial operations for MCCommsBridge, a module that interfaces via serial comms using a custom protocol. You can see it here, but the code is a tad clumsy:
If I were to do it over, I’d probably opt for just one thread and use event-driven serial packet handling.
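The one-thread, event-driven approach mentioned above can be sketched roughly as follows: the serial receive callback only enqueues packets, and a single worker thread blocks on the queue and does all the handling. This is an illustration under my own assumptions, not the actual MCCommsBridge code; all class and method names are hypothetical:

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class SerialWorker {

    // Packets are queued by the serial listener and drained by one worker thread.
    private final BlockingQueue<byte[]> inbox = new LinkedBlockingQueue<>();
    private volatile boolean running = true;

    // Called from the serial port's receive callback (stand-in for the real one).
    public void onPacket(byte[] packet) {
        inbox.offer(packet);
    }

    public void run() throws InterruptedException {
        while (running) {
            // Blocks until a packet arrives; no busy-waiting, so idle CPU cost is ~0.
            byte[] packet = inbox.poll(500, TimeUnit.MILLISECONDS);
            if (packet != null) {
                handle(packet);
            }
        }
    }

    private void handle(byte[] packet) {
        System.out.println("handled packet of " + packet.length + " bytes");
    }

    public void stop() {
        running = false;
    }

    // Tiny demo: feed one fake packet through the worker.
    public static void main(String[] args) throws Exception {
        SerialWorker worker = new SerialWorker();
        Thread t = new Thread(() -> {
            try {
                worker.run();
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        t.start();
        worker.onPacket(new byte[] { 0x01, 0x02 });
        Thread.sleep(100);
        worker.stop();
        t.join();
    }
}
```

The key property is that the worker never polls the port in a loop; it sleeps inside `poll()` until data shows up, which keeps the CPU free for the rest of the framework.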

Hope this helps.

Hello Kyle, thanks for your suggestions and your help. My question was less about my specific problem. I am used to using a Java profiler in a pure Java application to find bottlenecks. That does not seem to work on OpenEMS, so I was interested in a generic OSGi profiling tool. I couldn’t find one, so I thought someone here might have experience with that.

Regarding my specific problem: I finally solved it. For testing purposes, I had replaced the Modbus implementation in an existing OpenEMS battery module with my CAN implementation.
Because no physical battery was connected to the OpenEMS battery module, the CPU load increased massively.
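For anyone hitting the same symptom: a plausible mechanism (my assumption, not stated above) is a read loop that retries immediately when the device never answers, which turns into a busy-spin. A minimal sketch of a bounded exponential backoff that avoids this; `tryRead()` is a hypothetical stand-in for the real bus read and always fails here, as it would with no battery attached:

```java
public class ReconnectBackoff {

    public static void main(String[] args) throws InterruptedException {
        long delayMs = 10;           // initial retry delay
        final long maxDelayMs = 80;  // cap (kept small for the demo)
        int attempts = 0;

        while (attempts < 4) {
            if (tryRead()) {
                delayMs = 10;        // reset the delay after a success
            } else {
                attempts++;
                System.out.println("read failed, backing off " + delayMs + " ms");
                Thread.sleep(delayMs);                   // yield the CPU instead of spinning
                delayMs = Math.min(delayMs * 2, maxDelayMs); // double, up to the cap
            }
        }
    }

    // Simulates an absent device: every read fails.
    private static boolean tryRead() {
        return false;
    }
}
```

With a backoff like this, a missing device costs a few retries per second instead of pegging a core; the delay resets as soon as the device responds again.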