How can I run multiple curl requests processed sequentially?
They would most likely be processed sequentially (why not just test it?). But you can also do this:
Make a file called curlrequests.sh and put the requests in it, one per line, like this:

curl http://example.com/?update_=1
curl http://example.com/?update_=3
curl http://example.com/?update_=234
curl http://example.com/?update_=65
Save the file and make it executable with chmod:

chmod +x curlrequests.sh
Run your file:
./curlrequests.sh
or
/path/to/file/curlrequests.sh
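If the URLs only differ by a query value, a short loop keeps the script readable (a minimal sketch, reusing the example values above):

#!/bin/bash
# Each curl finishes before the next one starts, so the requests run sequentially
for id in 1 3 234 65; do
    curl "http://example.com/?update_=$id"
done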
As a side note, you can chain requests with &&, like this:

curl http://example.com/?update_=1 && curl http://example.com/?update_=2 && curl http://example.com/?update_=3
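Note that && only runs the next command if the previous one succeeded. Combined with curl's --fail option, which makes curl exit non-zero on HTTP errors, the chain stops at the first failing request (a minimal sketch):

# --fail makes curl return a non-zero exit code on HTTP errors (e.g. 404),
# so && stops the chain at the first request that fails
curl --fail http://example.com/?update_=1 && \
curl --fail http://example.com/?update_=2 && \
curl --fail http://example.com/?update_=3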
And execute them in parallel using &:
curl http://example.com/?update_=1 & curl http://example.com/?update_=2 & curl http://example.com/?update_=3
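If you put the parallel version in a script, you will probably want wait as well, so the script does not exit before the backgrounded requests have finished (a minimal sketch):

#!/bin/bash
# Each & puts a request in the background; wait blocks until all of them have completed
curl http://example.com/?update_=1 &
curl http://example.com/?update_=2 &
curl http://example.com/?update_=3 &
wait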
Another crucial method not mentioned here is reusing the same TCP connection for multiple HTTP requests, with exactly one curl command.
This is very useful for saving network bandwidth and client and server resources, and it removes the need for multiple curl commands, since by default curl closes the connection when the end of a command is reached.
Keeping the connection open and reusing it is very common for standard clients running a web app.
Starting with curl version 7.36.0, the --next (or -:) command-line option allows you to chain multiple requests, and it is usable both on the command line and in scripts.
For example:
- Sending multiple requests on the same TCP connection:
curl http://example.com/?update_=1 -: http://example.com/foo
- Sending multiple different HTTP requests on the same connection:
curl http://example.com/?update_=1 -: -d "I am posting this string" http://example.com/?update_=2
- Sending multiple HTTP requests with different curl options for each request:
curl -o 'my_output_file' http://example.com/?update_=1 -: -d "my_data" -s -m 10 http://example.com/foo -: -o /dev/null http://example.com/random
From the curl manpage:
-:, --next
Tells curl to use a separate operation for the following URL and associated options. This allows you to send several URL requests, each with their own specific options, for example, such as different user names or custom requests for each.
-:, --next will reset all local options and only global ones will have their values survive over to the operation following the -:, --next instruction. Global options include -v, --verbose, --trace, --trace-ascii and --fail-early.

For example, you can do both a GET and a POST in a single command line:
curl www1.example.com --next -d postthis www2.example.com
Added in 7.36.0.
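Since -v, --verbose is one of the global options that survives across --next, it can be used to check that the connection really is reused; the verbose output for the later requests should mention re-using the existing connection (the exact wording varies between curl versions):

# -v applies to every operation here; the verbose output for the second URL
# should indicate that the existing connection to example.com was reused
curl -v http://example.com/?update_=1 -: http://example.com/?update_=2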
According to the curl man page:
You can specify any amount of URLs on the command line. They will be fetched in a sequential manner in the specified order.
So the simplest and most efficient approach (curl will send them all down a single TCP connection, at least those to the same origin) would be to put them all on a single invocation of curl, e.g.:
curl http://example.com/?update_=1 http://example.com/?update_=2
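If the URLs differ only by a number, curl's URL globbing can generate the whole sequence in one invocation; the [1-4] range below is just an illustration (quote the URL so the shell does not touch the brackets):

# curl expands [1-4] and fetches each resulting URL in order,
# reusing the same connection where it can
curl "http://example.com/?update_=[1-4]"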
Write a script with the curl requests in the desired order and run it via cron, like:
#!/bin/bash
curl http://mysite.com/?update_=1
curl http://mysite.com/?the_other_thing
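A crontab entry along these lines would then run the script on a schedule; the 15-minute interval and the path are placeholders, not values from the answer above:

# m h dom mon dow   command
*/15 * * * * /path/to/curlrequests.sh >/dev/null 2>&1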