Big-O notation (e.g. O(N)) is used to express the worst-case order of growth of an algorithm: it describes how the algorithm's worst-case running time changes as the size of the data set it operates on increases.
O(log N) and O(N log N) generally mean that the algorithm deals with a data set that is iteratively partitioned, like a balanced binary tree. Generally, but not always, log N implies log₂ N, which means, roughly, the number of times you can partition a set in half, then partition the halves, and so on, while still having non-empty sets.
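You can see that "number of halvings" idea directly with a few lines of Java (class and method names here are just for illustration):

```java
public class Halvings {
    // Count how many times a set of size n can be cut in half
    // before only one element remains: this is floor(log2 n).
    static int halvings(int n) {
        int steps = 0;
        while (n > 1) {
            n /= 2;   // partition the set and keep one half
            steps++;
        }
        return steps;
    }

    public static void main(String[] args) {
        System.out.println(halvings(1024));      // 10, since 2^10 = 1024
        System.out.println(halvings(1_000_000)); // 19, since 2^19 <= 1,000,000 < 2^20
    }
}
```

Note how slowly the count grows: going from about a thousand elements to a million only roughly doubles the number of halvings. That's the whole appeal of O(log N).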
In this case, it means inserting items into a PriorityQueue that contains n items (where n is large - these scalings are only really relevant for big numbers) scales with log(n). Let's say that you've got a queue with 10,000 entries, and it's taking four seconds to insert items (in the worst case). Then you'd expect a queue with 100,000 entries to take five seconds to insert items [because log₁₀(10,000) = 4 and log₁₀(100,000) = 5 - the base of the logarithm only changes the constant factor, so it doesn't matter for the big-O classification].
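Timing inserts directly is noisy, but you can observe the log(n) cost another way: count the comparisons a single worst-case insert performs. Here's a sketch (the counting Comparator trick and the class name are my own, not anything from the PriorityQueue API itself). A min-heap insert does the most work when the new element is a new minimum, because it has to bubble all the way up to the root:

```java
import java.util.Comparator;
import java.util.PriorityQueue;
import java.util.concurrent.atomic.AtomicLong;

public class HeapInsertCost {
    // Returns the number of comparisons for one worst-case insert
    // into a PriorityQueue already holding n elements.
    static long comparisonsForWorstInsert(int n) {
        AtomicLong count = new AtomicLong();
        Comparator<Integer> counting = (a, b) -> {
            count.incrementAndGet();          // tally every comparison the heap makes
            return Integer.compare(a, b);
        };
        PriorityQueue<Integer> pq = new PriorityQueue<>(counting);
        for (int i = n; i > 0; i--) {         // fill the queue with n elements
            pq.offer(i);
        }
        count.set(0);                         // measure only the next insert
        pq.offer(0);                          // new minimum: sifts up to the root
        return count.get();
    }

    public static void main(String[] args) {
        for (int n : new int[]{1_000, 10_000, 100_000}) {
            System.out.println(n + " elements -> "
                    + comparisonsForWorstInsert(n) + " comparisons");
        }
    }
}
```

Each tenfold increase in queue size adds only three or four comparisons (roughly log₂ 10 ≈ 3.3), which is exactly the "four seconds becomes five seconds" behaviour described above.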