Yes, it is just a quick introduction. Coroutines were only finalised towards the end of writing the book, and we were determined to get some coverage in.
You can think of a coroutine as a kind of lightweight thread. When you run code in a thread you are typically using a physical resource: threaded code will normally run on a distinct core of the CPU (this doesn't *have* to happen, but it typically does). That means that as you use more and more threads, you steadily use up the resources available for running them, and after a while threads will start to block.
Coroutines provide a more advanced way of looking at multi-tasking code. Although you can run coroutines in their own threads (by using the global scope), coroutines commonly share threads. Most tasks involve doing some work and then waiting for something to happen. When a coroutine starts waiting, control of the thread is rapidly switched to another coroutine, which can make profitable use of the thread while the first coroutine isn't using it.
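To make that thread-sharing concrete, here's a minimal sketch (not from the book) using `kotlinx.coroutines`. When the first coroutine suspends at `delay()`, the thread isn't blocked, so the second coroutine gets to run on it straight away:

```kotlin
import kotlinx.coroutines.*

fun main() = runBlocking {
    // Both coroutines run on the same thread as runBlocking.
    launch {
        // delay() suspends this coroutine WITHOUT blocking the thread,
        // so the coroutine below can use the thread in the meantime.
        delay(100)
        println("Coroutine 1: back from waiting")
    }
    launch {
        println("Coroutine 2: doing work while coroutine 1 waits")
    }
}
```

Because the first coroutine suspends immediately, "Coroutine 2" prints before "Coroutine 1", even though both share one thread.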
This allows you to write code that looks very similar to threaded code, but which runs on the same thread. That lets you scale the number of coroutines in a way you never could with threads: there is no problem running hundreds of coroutines at the "same time".
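You can see the scaling for yourself with a quick (illustrative, not from the book) experiment. Launching this many plain threads would exhaust memory on most machines, but coroutines handle it easily because each suspended coroutine is just a small object, not a blocked OS thread:

```kotlin
import kotlinx.coroutines.*

fun main() = runBlocking {
    // Launch 100,000 coroutines. Each one suspends at delay(),
    // leaving the underlying thread free for the others.
    val jobs = List(100_000) {
        launch {
            delay(1000)
        }
    }
    jobs.forEach { it.join() }  // wait for them all to finish
    println("All ${jobs.size} coroutines finished")
}
```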
You can see the coroutine code for creating a drum machine here:
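As a rough sketch of the idea (the function name, beat patterns, and `println` stand-in for actual audio here are illustrative, not the book's actual listing), two instruments can each run in their own coroutine on a shared thread:

```kotlin
import kotlinx.coroutines.*

// Illustrative only: play a pattern where 'x' means "hit the drum"
// and '-' means "rest", waiting between beats without blocking the thread.
suspend fun playBeats(beats: String, label: String) {
    for (beat in beats) {
        if (beat == 'x') println(label)
        delay(100)  // suspend until the next beat
    }
}

fun main() = runBlocking {
    // Both instruments share runBlocking's thread, yet interleave in time.
    launch { playBeats("x-x-x-x-", "Toms") }
    launch { playBeats("x---x---", "Cymbal") }
}
```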
The code for playing the tom-toms and the cymbals will both run on the same thread, but they appear to run completely in parallel, because they are coroutines. It's this feature that makes coroutines incredibly powerful if you want to handle thousands of requests (for example) at the same time.
Thanks for the great question :-)
Head First Android Development