Load Stress Test & Traffic Generator

Pricing: $15.00/month + usage

Developed by Onidivo Technologies

Maintained by Community

Perform load and stress tests against your website. Drive traffic and test your website tools and integrations.

Rating: 4.0 (1 review)

Total users: 254

Monthly users: 17

Runs succeeded: 98%

Last modified: a year ago

You can access the Load Stress Test & Traffic Generator programmatically from your own applications by using the Apify API. To use the Apify API, you’ll need an Apify account and your API token, which you can find under Integrations settings in Apify Console.
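As a rough sketch of what such an API call looks like, the snippet below starts a run of this Actor with a plain HTTP POST to the Apify "run Actor" endpoint using only the Python standard library. The `run_input` fields here are illustrative placeholders, not the Actor's documented input schema — check the Actor's input tab in Apify Console for the real field names.

```python
import json
import os
import urllib.request

API_TOKEN = os.environ.get("APIFY_TOKEN", "<YOUR_API_TOKEN>")

# The Apify "run Actor" endpoint; the Actor ID uses a tilde
# between the author name and the Actor name.
url = "https://api.apify.com/v2/acts/onidivo~load-stress-test/runs"

# Illustrative input only -- the real input schema may differ.
run_input = {"targetUrl": "https://example.com", "duration": 60}

request = urllib.request.Request(
    url,
    data=json.dumps(run_input).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_TOKEN}",
    },
    method="POST",
)

# Sending the request starts a (potentially paid) Actor run, so
# only do it when a real token is configured.
if API_TOKEN != "<YOUR_API_TOKEN>":
    with urllib.request.urlopen(request) as response:
        run = json.load(response)["data"]
        print(run["id"], run["status"])
```

The response body of a successful call contains a `data` object describing the newly started run, whose status you can poll via the same API.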

{
  "mcpServers": {
    "apify": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://mcp.apify.com/sse?actors=onidivo/load-stress-test",
        "--header",
        "Authorization: Bearer <YOUR_API_TOKEN>"
      ]
    }
  }
}

Configure MCP server with Load Stress Test & Traffic Generator

You have a few options for interacting with the MCP server:

  • Use mcp.apify.com via mcp-remote from your local machine to connect and authenticate using OAuth or an API token (as shown in the JSON configuration above).

  • Set up the connection directly in your MCP client UI by providing the URL https://mcp.apify.com/sse?actors=onidivo/load-stress-test along with an API token (or use OAuth).

  • Connect to mcp.apify.com via Server-Sent Events (SSE), as shown below:

{
  "mcpServers": {
    "apify": {
      "type": "sse",
      "url": "https://mcp.apify.com/sse?actors=onidivo/load-stress-test",
      "headers": {
        "Authorization": "Bearer <YOUR_API_TOKEN>"
      }
    }
  }
}
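As a minimal sketch of what an MCP client does under the hood with this SSE configuration, the snippet below opens the event stream with the same URL and Authorization header using only the Python standard library. It is a low-level illustration, not a full MCP client: after connecting, a real client would read the server's initial event (which points it at the URL for posting JSON-RPC messages) and then drive the protocol from there.

```python
import os
import urllib.request

API_TOKEN = os.environ.get("APIFY_TOKEN", "<YOUR_API_TOKEN>")

# SSE endpoint exposing only this Actor as an MCP tool.
url = "https://mcp.apify.com/sse?actors=onidivo/load-stress-test"

request = urllib.request.Request(
    url,
    headers={
        "Accept": "text/event-stream",
        "Authorization": f"Bearer {API_TOKEN}",
    },
)

# Opening the stream requires a valid token, so guard the call.
if API_TOKEN != "<YOUR_API_TOKEN>":
    with urllib.request.urlopen(request) as stream:
        # Print the first few lines of the event stream, which
        # include the server's initial event for this session.
        for _ in range(5):
            print(stream.readline().decode("utf-8").rstrip())
```

In practice you would rarely write this by hand — an MCP client library or one of the clients listed below handles the stream and the JSON-RPC exchange for you.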

You can connect to the Apify MCP Server using clients like Tester MCP Client, or any other MCP client of your choice.

If you want to learn more about our Apify MCP implementation, check out our MCP documentation. To learn more about the Model Context Protocol in general, refer to the official MCP documentation or read our blog post.