This is where you use a technique known as "arithmetic". For example: you have 100 clients, each sending a request once every two seconds. One client means 0.5 requests per second; 100 clients mean 100 times that, or 50 requests per second.
Next piece of arithmetic: how long will the server take to process those requests? First we need to know how long it takes to process one request, and we don't know that yet. So over to you... how long does your server take to process one such request?
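The arithmetic above can be sketched in a few lines. This is a back-of-the-envelope calculation using the hypothetical numbers from the example (100 clients, one request every two seconds); the worker count is likewise an assumption, since the thread/process model of the actual server isn't known:

```python
# Aggregate arrival rate: 100 clients, each sending one request every 2 seconds.
clients = 100
requests_per_client_per_sec = 1 / 2  # one request every two seconds

arrival_rate = clients * requests_per_client_per_sec  # requests/sec hitting the server
print(arrival_rate)  # 50.0 requests per second

def max_service_time(rate, workers=1):
    """Longest average per-request processing time (seconds) that still
    keeps up with the given arrival rate, assuming `workers` requests
    can be processed in parallel."""
    return workers / rate

# With a single worker, each request must finish in under 1/50 s = 20 ms.
print(max_service_time(arrival_rate))      # 0.02  -> 20 ms per request
# With 8 parallel workers, the budget grows to 8/50 s = 160 ms.
print(max_service_time(arrival_rate, 8))   # 0.16  -> 160 ms per request
```

The point of the sketch: once you measure the actual per-request time, compare it against this budget; if it's higher, the server falls behind and requests queue up.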
posted 7 years ago
I don't know, but this is the environment information I have:
shared hosting environment
Server Speed & Reliability
-- RedHat Enterprise Linux 4 or 5
-- Dual DS-3 connections with auto-routing
-- OC48 SONET ring backbone
-- dual Xeon 3 GHz processors
-- 2 GB DDR RAM
-- 73 GB SCSI hard drives with RAID 5 configuration