NGINX worker_connections are not enough

connection, curl, nginx

I'm not very familiar with nginx workers, so I'll need some help here. I have two sites on one server.

The second one (http://example-second.com) makes an API call through the first one (http://example.com) and receives a JSON array. This is the scenario.

In the http://example-second.com/api_call.php file I have:

function get_curl($url) {
    $curl = curl_init();
    curl_setopt($curl, CURLOPT_URL, $url);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($curl, CURLOPT_HEADER, false);
    $result = curl_exec($curl);
    curl_close($curl);
    return $result;
}
$response = get_curl("http://example.com/api.php?ID=foo");

In http://example.com/api.php:

function get_curl_from_api($url) {
    $curl = curl_init();
    curl_setopt_array($curl, array(
        CURLOPT_URL => $url,
        CURLOPT_HEADER => 0,
        CURLOPT_RETURNTRANSFER => 1,
        CURLOPT_SSL_VERIFYPEER => 1,
        CURLOPT_SSL_VERIFYHOST => 2
    ));
    $resp = curl_exec($curl);
    curl_close($curl);
    return $resp;
}

$response = get_curl_from_api("https://thirdpartsite.com/".$_GET["ID"]);

From time to time I receive NULL on the example-second site, with this error in the log:

32224#32224: 10 worker_connections are not enough

I know what this means, since in nginx.conf I have:

worker_processes auto;

events {
    worker_connections 10;
    multi_accept on;
    use epoll;
}

My question is: how exactly do workers work, and why does this only happen from time to time? Is there some cache or keep-alive session or something? And a second question: is it possible to work around this without increasing worker_connections? Perhaps some delay in the curl function?

Best Answer

Nginx has an asynchronous, event-driven architecture: a single worker process handles many connections at once. worker_connections is the maximum number of connections each worker process can open simultaneously, and that counts not just client connections but also upstream/proxied connections and so on. In your setup, every request to the second site also triggers a curl request to the first site on the same server, so connections add up quickly. You have this option set to the very low value of 10; at the very least you should raise it back to the default value of 512.
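
For example, a minimal sketch of the events block with the default restored, keeping your other settings as they are (512 is nginx's default for worker_connections; raise it further if your traffic needs it and your open-file limits allow):

events {
    worker_connections 512;    # nginx default; the count includes upstream connections too
    multi_accept on;
    use epoll;
}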