Parallel processing in PHP - receive and respond to thousands of requests while doing the calculations behind the scenes
I am working on a PHP script that:
- receives a client request;
- processes the request via a CPU- and time-intensive binary computation;
- stores the result of the computation in a MySQL database;
- then responds to the client with status 200 OK.

The problem: when there are tens of thousands of requests coming in per second during peak hours, clients have to wait a long time to receive the status 200 OK.
Flexibilities: the script does not need to respond to the client with the result of the computation, and the status 200 OK does not need to depend on the success or failure of the computation. The computation may fail, and that's okay; the actual computation can happen in parallel behind the scenes.

What tools/packages/libraries/strategies should be used to achieve this kind of intensive request-handling design in PHP? Is this solvable on the PHP side, or on the Apache side?
Notes:
- I am running Apache, MySQL, PHP, and Redis on Ubuntu (AMPRU).
- Clients send a request and receive status 200 OK right away; they do not wait for the computation of their request to complete.
- There is no auto-scaling or load-balancing concept in place: it's a single AMPRU server.
- It would be better if multiple computations can happen in parallel behind the scenes.
This is a classic use case for a queue. Of the tech stack you have listed, Redis has support for queues (check out the php-resque library), or there are other tools that can be used, such as beanstalkd (a favourite of mine, with the Pheanstalk PHP library) or Amazon SQS. There are a number of other options, both self-hosted and available as services.
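As a minimal sketch of the receiving side, assuming the phpredis extension, a local Redis server, and a list key named `jobs` (the key name and payload fields are illustrative):

```php
<?php
// Receiving script: validate, enqueue, and answer 200 OK immediately.
// Assumes the phpredis extension and Redis running on localhost.

$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

// Whatever payload the computation needs, serialized as JSON.
$job = json_encode([
    'input'       => $_POST['input'] ?? '',
    'received_at' => time(),
]);

// RPUSH appends the job to the tail of the "jobs" list (our queue).
$redis->rPush('jobs', $job);

// Respond right away; the computation happens later, in a worker.
http_response_code(200);
echo 'OK';
```

The enqueue is a single fast Redis command, so the client's wait time no longer depends on how long the computation takes.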
The website, or whatever other mechanism receives the data, pushes it onto the queue and returns 200 OK. Back-end workers, either a simple cron-based system or (better) multiple long-running scripts (occasionally restarted for clean-up), pull items off the queue, perform the work, and save the results.
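A long-running worker along those lines might look like this (a sketch, assuming the same `jobs` list as on the enqueue side, a hypothetical `compute()` wrapper around your binary computation, and PDO for MySQL; run several copies of this script for parallel processing):

```php
<?php
// Worker: blocks on the queue, computes, stores the result in MySQL.
// Assumes phpredis and PDO; connection details are placeholders.

$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

$db   = new PDO('mysql:host=127.0.0.1;dbname=app', 'user', 'pass');
$stmt = $db->prepare('INSERT INTO results (input, output) VALUES (?, ?)');

while (true) {
    // BLPOP blocks for up to 5 seconds waiting for a job, avoiding busy-polling.
    $item = $redis->blPop(['jobs'], 5);
    if (empty($item)) {
        continue; // timed out with no job; loop and wait again
    }

    // blPop returns [key, value]; the value is our JSON payload.
    $job = json_decode($item[1], true);

    try {
        $result = compute($job['input']); // hypothetical: the CPU-intensive binary computation
        $stmt->execute([$job['input'], $result]);
    } catch (Throwable $e) {
        // The computation may fail, and that's okay; log and move on.
        error_log('job failed: ' . $e->getMessage());
    }
}
```

In practice you would run the workers under a process supervisor such as supervisord or systemd, so crashed or restarted workers come back up cleanly.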
I've literally run hundreds of millions of jobs through such systems. The workers, as long as they can reach the queue and database servers, don't have to run on the same machine (I have run dozens of workers across many servers).