PHP curl_multi_getcontent returns null

Looking at your loop, I'm wondering:

$i = 0;
while ($row = $result->fetch_object()) {
    $data = curl_multi_getcontent($ch[$i]);
    $json_data = json_decode($data);
    var_dump($json_data);

Are you forgetting to increment $i? If so, you grab the content for $ch[0] on the first iteration and then keep calling curl_multi_getcontent on that same handle for every subsequent row.
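If that is the case, the fix is just to advance the counter once per row (a minimal sketch, assuming the handles were added in the same order as the rows come back):

$i = 0;
while ($row = $result->fetch_object()) {
    $data = curl_multi_getcontent($ch[$i]);
    $json_data = json_decode($data);
    var_dump($json_data);
    $i++; // move on to the next handle for the next row
}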

Also, I've written a blog post covering concurrent requests with PHP's cURL extension, and it contains a general function for curl multi requests. You could call this function in the following manner:

$responses = multi(
    $requests = [
        ['url' => 'https://example.com/search/username1/'],
        ['url' => 'https://example.com/search/username2/'],
        ['url' => 'https://example.com/search/username3/']
    ],
    $opts = [
        CURLOPT_CAINFO => 'cacert.pem',
        CURLOPT_USERPWD => "username:password"
    ]
);

Then, you cycle through the responses array:

foreach ($responses as $response) {
    if ($response['error']) {
        // handle error
        continue;
    }
    // check for empty response
    if ($response['data'] === null) {
        // examine $response['info']
        continue;
    }
    // handle data
    $data = json_decode($response['data']);
    // do something
}

Using this function, you could run a simple test of accessing HTTPS sites with the following call:

multi(
    $requests = [
        'google' => ['url' => 'https://www.google.com'],
        'linkedin' => ['url' => 'https://www.linkedin.com/']
    ],
    $opts = [
        CURLOPT_CAINFO => '/path/to/your/cacert.pem',
        CURLOPT_SSL_VERIFYPEER => true
    ]
);
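Since I can't paste the whole blog post here, below is a rough sketch of the shape of that multi() helper. It is an approximation based on the calls above, not the exact code from the post: it assumes each request is an array with a 'url' key, that $opts holds cURL options applied to every handle, and that each response carries the 'data', 'error' and 'info' keys used in the foreach earlier.

function multi(array $requests, array $opts = [])
{
    $mh = curl_multi_init();
    $handles = [];

    // Create one easy handle per request and register it with the multi handle.
    foreach ($requests as $key => $request) {
        $ch = curl_init($request['url']);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt_array($ch, $opts);
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }

    // Run all transfers until none of them is active any more.
    $active = null;
    do {
        $mrc = curl_multi_exec($mh, $active);
    } while ($active && $mrc == CURLM_OK);

    // Collect the results, keyed the same way as $requests.
    $responses = [];
    foreach ($handles as $key => $ch) {
        $responses[$key] = [
            'data'  => curl_multi_getcontent($ch), // the body (needs CURLOPT_RETURNTRANSFER)
            'error' => curl_error($ch),            // '' when the transfer succeeded
            'info'  => curl_getinfo($ch),          // status code, timings, etc.
        ];
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $responses;
}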

I see that your execution loop is different from the one that is advised in the PHP documentation:

do {
  $mrc = curl_multi_exec($mh, $active);
} while ($mrc == CURLM_CALL_MULTI_PERFORM);

Note that the while condition compares the function's return value, not the second parameter ($active).

Edit: Thanks to Adam's comment, I have tested both syntaxes and found that they behave the same and are both asynchronous. Here is a working example of an asynchronous multi request that collects the content into a variable:

<?php
$ch = array();
$mh = curl_multi_init();
$total = 100;

echo 'Start: ' . microtime(true) . "\n";

for ($i = 0; $i < $total; $i++) {
    $ch[$i] = curl_init();
    curl_setopt($ch[$i], CURLOPT_URL, 'http://localhost/sleep.php?t=' . $i);
    curl_setopt($ch[$i], CURLOPT_HEADER, 0);
    curl_setopt($ch[$i], CURLOPT_RETURNTRANSFER, true);

    curl_multi_add_handle($mh, $ch[$i]);
}

$active = null;
do {
    $mrc = curl_multi_exec($mh, $active);
    usleep(100); // Maybe needed to limit CPU load (See P.S.)
} while ($active);

foreach ($ch as $i => $c) {
    $r = curl_multi_getcontent($c);
    var_dump($r);
    curl_multi_remove_handle($mh, $c);
}

curl_multi_close($mh);

echo 'End: ' . microtime(true) . "\n";

And the test file, sleep.php:

<?php
$start = microtime(true);

sleep( rand(3, 5) );

$end = microtime(true);

echo $_GET['t'], ': ', $start, ' - ', $end, ' - ', ($end - $start);
echo "\n";

P.S. The initial idea behind the usleep inside the loop was to pause it briefly and thus reduce the number of iterations while cURL waits for a response. At first it seemed to work that way, but later tests with top showed only a minimal difference in CPU load (17% with usleep versus 20% without it), so I am not sure whether it is worth using. Tests on a real server might show different results.
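If the CPU load does matter, an alternative to the usleep is the pattern from the curl_multi_select manual page, which blocks until there is activity on one of the sockets instead of polling. I have not benchmarked it against the loop above, so treat it as a sketch:

$active = null;
do {
    $mrc = curl_multi_exec($mh, $active);
} while ($mrc == CURLM_CALL_MULTI_PERFORM);

while ($active && $mrc == CURLM_OK) {
    // Wait (up to 1 second by default) until one of the handles has activity.
    if (curl_multi_select($mh) == -1) {
        usleep(100); // select failed; back off briefly instead of spinning
    }
    do {
        $mrc = curl_multi_exec($mh, $active);
    } while ($mrc == CURLM_CALL_MULTI_PERFORM);
}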

Edit 2: I have tested my code by making a request to a password-protected HTTPS page (with CURLOPT_CAINFO and CURLOPT_USERPWD equal to those in the question). It works as expected, so there may be a bug in your version of PHP or cURL. My versions are "PHP Version 5.3.10-1ubuntu3.8" and cURL 7.22.0, and they have no problems.
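For the record, the per-handle setup for that test looks roughly like this; the URL, certificate path and credentials below are placeholders, not the real values from the question:

$ch[$i] = curl_init();
curl_setopt($ch[$i], CURLOPT_URL, 'https://example.com/protected/page'); // placeholder URL
curl_setopt($ch[$i], CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch[$i], CURLOPT_CAINFO, '/path/to/cacert.pem');             // placeholder path
curl_setopt($ch[$i], CURLOPT_USERPWD, 'username:password');              // placeholder credentials
curl_multi_add_handle($mh, $ch[$i]);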