This piece of code tells the cURL library where to put the data it gathers:
curl_easy_setopt(curl_handle, CURLOPT_WRITEDATA, (void *)&chunk);
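For reference, chunk here is the little buffer struct along the lines of the getinmemory.c example that ships in the cURL tarball, roughly:

struct MemoryStruct {
    char *memory;   /* buffer the callback grows with realloc() */
    size_t size;    /* number of bytes stored so far */
};

struct MemoryStruct chunk;
chunk.memory = malloc(1);   /* grown as data arrives */
chunk.size = 0;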
This piece of code tells it which callback function to use:
curl_easy_setopt(curl_handle, CURLOPT_WRITEFUNCTION, WriteMemoryCallback);
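The callback itself has the signature libcurl expects for CURLOPT_WRITEFUNCTION: it gets the incoming bytes plus the pointer set with CURLOPT_WRITEDATA, and must return the number of bytes it handled (returning anything less aborts the transfer). A sketch along the lines of the tarball example:

static size_t WriteMemoryCallback(void *contents, size_t size, size_t nmemb, void *userp)
{
    size_t realsize = size * nmemb;
    struct MemoryStruct *mem = (struct MemoryStruct *)userp;

    /* grow the buffer and append the new data */
    char *ptr = realloc(mem->memory, mem->size + realsize + 1);
    if (ptr == NULL)
        return 0;   /* out of memory: abort the transfer */
    mem->memory = ptr;
    memcpy(&(mem->memory[mem->size]), contents, realsize);
    mem->size += realsize;
    mem->memory[mem->size] = '\0';

    return realsize;   /* tell libcurl everything was consumed */
}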
Once you call curl_easy_perform() it does whatever the options tell it to do, and calls the callback function whenever the remote server sends back data. I'm pretty sure it's safe, as copying the pointer is what the examples in the cURL install tarball show, and I haven't had any problems with it.
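Putting the pieces together, the sequence looks roughly like this inside whatever function does the fetch (error handling trimmed; the URL is just a placeholder):

#include <curl/curl.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* struct MemoryStruct and WriteMemoryCallback as above */

struct MemoryStruct chunk = { malloc(1), 0 };

curl_global_init(CURL_GLOBAL_ALL);
CURL *curl_handle = curl_easy_init();
curl_easy_setopt(curl_handle, CURLOPT_URL, "http://example.com/big.xml");   /* placeholder */
curl_easy_setopt(curl_handle, CURLOPT_WRITEFUNCTION, WriteMemoryCallback);
curl_easy_setopt(curl_handle, CURLOPT_WRITEDATA, (void *)&chunk);

CURLcode res = curl_easy_perform(curl_handle);   /* callback fires here, chunk by chunk */
if (res != CURLE_OK)
    fprintf(stderr, "curl_easy_perform() failed: %s\n", curl_easy_strerror(res));

curl_easy_cleanup(curl_handle);
free(chunk.memory);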
The main problem I face is that I am pulling in several megabytes of data, which the stock callback would simply keep appending to the data already in memory from the request. Rather than waste time and memory pulling all of that into a buffer, I want to look at the chunks as they come in, pull out what I need from them, and throw away the rest.
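Something along these lines is what I have in mind; this is an untested sketch, and ExtractWhatINeed() is just a made-up stand-in for whatever parsing I end up doing:

static size_t StreamingCallback(void *contents, size_t size, size_t nmemb, void *userp)
{
    size_t realsize = size * nmemb;

    /* scan this chunk, keep only the interesting bits, drop everything else */
    ExtractWhatINeed((const char *)contents, realsize, userp);   /* hypothetical helper */

    return realsize;   /* claim the whole chunk so the transfer keeps going */
}

The catch, as far as I can tell, is that libcurl hands the data over at arbitrary boundaries, so whatever I'm looking for could end up split across two calls to the callback.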
Since the callback is a static function, I am not sure how I can make the data it processes available to the rest of my Qt app outside of that function.
I did try the static member approach that marcel suggested; it compiled fine, but the program crashed as soon as it started using that variable. Do you have an example of how this would work?