Single- vs. multi-threaded programming on a single core processor
Yes, multi-threading is useful even on a single core. If one thread in an application blocks waiting for something (say, data from the network card, or for the disk to finish writing), the CPU can switch to another thread and keep doing useful work.
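To make that concrete, here is a minimal Python sketch (using `time.sleep` purely as a stand-in for a blocking network or disk call) showing two blocking operations overlapping on a single core:

```python
import threading
import time

def blocking_io(name, seconds):
    # time.sleep stands in for a blocking call such as a socket recv()
    # or a disk write; while this thread is waiting, the single CPU
    # core is free to run the other thread.
    time.sleep(seconds)
    print(f"{name} done")

t1 = threading.Thread(target=blocking_io, args=("network read", 2))
t2 = threading.Thread(target=blocking_io, args=("disk write", 2))

start = time.time()
t1.start()
t2.start()
t1.join()
t2.join()
print(f"elapsed: {time.time() - start:.1f}s")  # ~2s rather than ~4s, even on one core
```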
BeOS was written with pervasive multithreading in mind, even in the era of single-core processors. The result was a very responsive OS, though a rather difficult one to program for.
On a single core processor, an application that uses asynchronous (non-blocking) I/O will be slightly more efficient than one that uses multiple blocking threads, because it avoids the overhead of context switching between threads.
Also, asynchronous I/O scales better than blocking I/O in threads because the overhead per extra I/O operation is minimal compared to the overhead of creating a new thread.
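As an illustration of that scaling argument, here is a sketch using Python's asyncio (one of many async I/O frameworks; `fake_request` is a made-up name and `asyncio.sleep` stands in for real non-blocking I/O):

```python
import asyncio

async def fake_request(i):
    # asyncio.sleep stands in for a real non-blocking I/O call (e.g.
    # reading from a connection opened with asyncio.open_connection).
    # Each pending operation costs a small coroutine object, not a
    # whole thread with its own stack.
    await asyncio.sleep(1)
    return i

async def main():
    # A single thread and one event loop multiplex all 1,000 waits.
    results = await asyncio.gather(*(fake_request(i) for i in range(1000)))
    print(len(results), "requests completed")

asyncio.run(main())
```

All one thousand "requests" complete in roughly one second on one thread; doing the same with one blocking thread per request would mean allocating a thousand thread stacks.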
Having said that, you shouldn't generally use single-threaded asynchronous I/O in new applications, because almost all new processors are multicore. Instead, you should still use asynchronous I/O, but split the work among a set of worker threads using something like a thread pool. Your system documentation will tell you the ideal number of worker threads; usually it is equal to the number of processing cores available.
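A minimal sketch of the thread-pool half of that advice in Python (the `handle_item` work function is hypothetical; a real worker would mix non-blocking I/O with computation):

```python
import os
from concurrent.futures import ThreadPoolExecutor

def handle_item(item):
    # Hypothetical unit of work standing in for real request handling.
    return item * item

# Size the pool to the number of available cores, as suggested above.
workers = os.cpu_count() or 1

with ThreadPoolExecutor(max_workers=workers) as pool:
    results = list(pool.map(handle_item, range(32)))

print(results)
```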
Edit: On the Windows platform at least, the async/await pattern in .NET is the modern way to perform asynchronous I/O. It makes asynchronous code as trivially easy to write as the old blocking style, so there is almost no excuse for writing blocking I/O now.
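The same pattern exists outside .NET. For instance, here is a sketch using Python's analogous async/await syntax (the host `example.com` and function name are just illustrative); note how it reads top to bottom like the old blocking version:

```python
import asyncio

async def fetch_status_line(host="example.com", port=80):
    # 'await' hands control back to the event loop while the network
    # I/O is in flight, so the calling thread is never blocked, yet
    # the code looks like straightforward sequential I/O.
    reader, writer = await asyncio.open_connection(host, port)
    writer.write(b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")
    await writer.drain()
    status = await reader.readline()
    writer.close()
    await writer.wait_closed()
    return status.decode().strip()

print(asyncio.run(fetch_status_line()))
```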
There are still advantages to be gained, but they're a bit situational.
In many cases, giving an application multiple threads allows it to claim a larger share of CPU time from the scheduler relative to other processes. This is finicky to balance, and each thread you introduce adds a bit of overhead, but it can be a reason.
If you are dealing with multiple potentially blocking resources, like file I/O or GUI interaction, then multithreading can be vital; a sketch follows.
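For example, here is a minimal Python sketch (the file name, data, and `save_report` helper are made up) of pushing a blocking file write onto a background thread so the main/GUI thread stays responsive:

```python
import threading
import queue

done = queue.Queue()

def save_report(path, data):
    # The blocking file write happens on a worker thread, so the main
    # (GUI/event) thread never stalls waiting on the disk.
    with open(path, "w") as f:
        f.write(data)
    done.put(f"saved {path}")

worker = threading.Thread(target=save_report, args=("report.txt", "x" * 1_000_000))
worker.start()

# The main thread stays free to keep processing user events; here it
# simply waits for the completion message.
print(done.get())
worker.join()
```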