Is there a way to use curl interactively? Or is there an interactive curl/wget shell?
On many Linux/Unix systems, your pseudocode will just work in any shell, although your paths should really be full URLs.
For instance, on Debian-based systems, the package libwww-perl installs three symlinks to lwp-request called /usr/bin/GET, /usr/bin/HEAD, and /usr/bin/POST. These do what you would expect. Recent versions of OpenSuse's perl-libwww-perl package omit the symlinks (which is probably a bug), so you would have to create them yourself or use lwp-request directly.
directly. Generally and for many years, it has been quite a safe assumption that GET, HEAD, and POST executables are available on unixoid systems.
Of course you could also use curl for all of these tasks, so perhaps I do not understand why you feel that a command line shell such as bash is not interactive.
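For comparison, rough curl equivalents would be (again with a placeholder URL):
$ curl http://example.com/
$ curl -I http://example.com/
$ curl -d 'q=hello' http://example.com/search
Here -I makes a HEAD request and -d submits the given string as a POST body.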
Thanks for the answers.
After googling around, I found resty, which is a shell script wrapper around the curl tool. This is really what I want. It's 155 lines of shell script, and when I source it, I get shell functions for GET, PUT, POST, DELETE, and OPTIONS. These functions are just wrappers around the curl program found on my path.
It works like this in bash on Mac OS X:
$ . resty
$ resty https://api.example.org
https://api.example.org*
$ GET /v1/o/orgname -u myusername:password
{
  "createdAt" : 1347007133508,
  "createdBy" : "admin",
  "displayName" : "orgname",
  "environments" : [ "test", "prod" ],
  "lastModifiedAt" : 1347007133508,
  "lastModifiedBy" : "admin",
  "name" : "orgname",
  "properties" : {
    "propertyList" : [ ... ]
  }
}
$
The first line sources resty; running its commands in the current shell is what defines the GET, PUT, POST, etc. functions.
The next line, the "resty" command, sets the URL base. Thereafter, any call to GET, PUT, POST... implicitly references that base. I showed an example that emits prettified JSON. I think if your server emits minified JSON, you could pretty-print it with an external script by piping the output.
There's also support for host-based preferences. Suppose your target host is api.example.org. Create a file called ~/.resty/api.example.org and put in it lines specifying arguments that should be passed to every curl call to that host. Each HTTP verb gets its own line. So, putting this content in the file:
GET -u myusername:mypassword --write-out "\nStatus = %{http_code}\n"
...means that every time I do a GET when api.example.org is the base hostname, the curl command will implicitly use the -u and --write-out args shown there (-u is for basic auth).
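With that file in place, a session might look something like this (output sketched; the Status line comes from the --write-out format):
$ GET /v1/o/orgname
{ ... }
Status = 200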
As another example, you could specify the Accept header in that file, so that you always request XML:
GET --header "Accept: application/xml"
Any curl command-line arg is supported in that preferences file, but all the curl args for a given host+verb tuple need to go on a single line.
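So a fuller ~/.resty/api.example.org might look like this, one line per verb (the contents are purely illustrative):
GET -u myusername:mypassword --header "Accept: application/xml"
POST -u myusername:mypassword --header "Content-Type: application/xml"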
Handy.
Another option is lftp:
$ lftp http://repo.xplico.org/pool/
cd ok, cwd=/pool
lftp repo.xplico.org:/pool> ls
drwxr-xr-x -- /
drwxr-xr-x - 2012-02-13 09:48 main
lftp repo.xplico.org:/pool> cd main
lftp repo.xplico.org:/pool/main> ls
drwxr-xr-x -- ..
drwxr-xr-x - 2012-02-13 09:48 x
Directory listings only work for websites that send directory indexes. But even if they don't, you can still use the get command to fetch individual files.
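For example (the file name here is hypothetical):
lftp repo.xplico.org:/pool/main> get xplico_1.0_amd64.deb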