Overview
Agents may compete with other applications for limited Random Access Memory (RAM). Through the JetPatch Agent Manager, you can limit the amount of RAM that each managed agent is allowed to consume.
| Note: In addition to the parent process, all child processes are monitored, so they can all be throttled. |
RAM throttling can be configured both for all managed agents under the Intigua Connector and for individual managed agents. To reach these settings, navigate to Agents & Tools → Tools Catalog on the left side of the interface.
Limit RAM for All Managed Agents (Connector Level)
To limit the total RAM consumption of all agents managed under a specific connector:
- Select the Connector Icon and navigate to the Services tab.
- Select the connector you wish to manage.
- Navigate to the CPU and Memory Control section.
It should resemble this image:
Limit RAM for a Specific Managed Agent
To limit the RAM consumption of a specific managed agent:
- In the Agents section, choose a specific agent (rather than the Intigua Connector).
- Navigate to the CPU and Memory Control section — the same navigation as above.
It should resemble this image:
Equal Throttling Across Agents — Example Scenario
For each endpoint, all managed agents will be throttled equally to meet the configured endpoint RAM consumption threshold.
Configuration:
- Intigua Connector RAM limit: 512MB
Limits configured for the managed agents:
| Agent | Configured Limit |
| --- | --- |
| Agent 1 | 256MB |
| Agent 2 | 256MB |
| Agent 3 | 256MB |
| Agent 4 | 256MB |
If, at run time, the actual RAM consumption is:
| Agent | Actual Usage |
| --- | --- |
| Agent 1 | 512MB |
| Agent 2 | 512MB |
| Agent 3 | 512MB |
| Agent 4 | 512MB |
| Result: All agents will be throttled equally to their configured limits, 256MB each. |
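The throttling arithmetic in this scenario can be sketched as follows. This is an illustrative Python sketch, not JetPatch code; the agent names and the `throttle` helper are hypothetical, and it simply caps each agent's actual usage at its configured limit:

```python
# Illustrative sketch of equal throttling: each managed agent's RAM
# consumption is capped at its configured per-agent limit.

configured_limits_mb = {
    "Agent 1": 256,
    "Agent 2": 256,
    "Agent 3": 256,
    "Agent 4": 256,
}

actual_usage_mb = {
    "Agent 1": 512,
    "Agent 2": 512,
    "Agent 3": 512,
    "Agent 4": 512,
}

def throttle(actual: dict, limits: dict) -> dict:
    """Cap each agent's usage at its configured limit."""
    return {agent: min(usage, limits[agent]) for agent, usage in actual.items()}

throttled = throttle(actual_usage_mb, configured_limits_mb)
# Each agent is throttled from 512MB down to its 256MB limit.
```

Since every agent here exceeds its configured limit by the same amount, each one is throttled equally, matching the result in the table above.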