That's not entirely where I see the use of the Prototype pattern. I tend to use it more where the default case for something might change as the application is used.
Imagine something as simple as the "font" choice in your word processor. Many word processors work in a similar way: a font chooser of some sort does double duty, both as a way of setting the default font for future typing and as a way of changing the font of a selected area. To change the font of a selected area, you highlight it and update the "default" font style; the new "default" style is then applied to the selection.
Let's consider a way of modelling this. Our very simple word processor has an array of characters, and some FontStyle objects, each of which applies to a range of characters (cf. the Flyweight pattern). It also has a "special" FontStyle object which represents the current default style for new typing (it's the same class, but it might have a null text range, for example). Let's assume the default starts at "Arial, 10pt, regular".
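To make that concrete, here's a minimal sketch in Java of how the FontStyle class might look. All the names (FontStyle, TextRange, copy) are illustrative assumptions, not taken from any real word processor:

    // A run of styled text; the range is null on the "default" prototype.
    class TextRange {
        int start;
        int end;  // grows as the user types
        TextRange(int start, int end) { this.start = start; this.end = end; }
    }

    class FontStyle {
        String face;
        int sizePt;
        boolean italic;
        TextRange range;  // null for the "default" style object

        FontStyle(String face, int sizePt, boolean italic) {
            this.face = face;
            this.sizePt = sizePt;
            this.italic = italic;
        }

        // The Prototype operation: duplicate the style attributes,
        // leaving the range to be set when the copy is attached to text.
        FontStyle copy() {
            return new FontStyle(face, sizePt, italic);
        }
    }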
So I type some characters. They have to have some style, so a new FontStyle object is created as a copy of the current default style. After a few words, I pop up the Font dialog and hit the "italic" button. Now the default FontStyle object contains "Arial, 10pt, italic". The first FontStyle object, which had been extending as I typed, is now closed, and a new one is created and opened. Just as before, the new one is a copy of the current default FontStyle, but it's different from the first one.
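Continuing the sketch, a Document class might drive that flow like this (again, hypothetical names; the point is that every new run starts as a copy of the prototype):

    import java.util.ArrayList;
    import java.util.List;

    class Document {
        private final StringBuilder characters = new StringBuilder();
        private final List<FontStyle> styles = new ArrayList<>();
        private FontStyle defaultStyle = new FontStyle("Arial", 10, false);
        private FontStyle openStyle;  // the run currently being extended

        void type(String text) {
            if (openStyle == null) {
                // Prototype in action: a new run starts as a copy of the default.
                openStyle = defaultStyle.copy();
                openStyle.range = new TextRange(characters.length(), characters.length());
                styles.add(openStyle);
            }
            characters.append(text);
            openStyle.range.end = characters.length();
        }

        void setDefaultItalic(boolean italic) {
            // Changing the default closes the current run; the next keystroke
            // clones the updated prototype into a fresh FontStyle.
            defaultStyle.italic = italic;
            openStyle = null;
        }
    }

So doc.type("Hello "), then doc.setDefaultItalic(true), then doc.type("world") leaves two FontStyle runs: one regular, one italic, each cloned from whatever the default was at the moment the run opened.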
This is an example of using the Prototype pattern (copying the default FontStyle object each time the font changes) rather than other approaches, which might involve tracking the font selections in other objects and calling a complex multi-argument constructor, or a sequence of setters, each time a new FontStyle object is needed.
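In code terms, the difference is just this (sticking with the hypothetical names above):

    // Without the prototype: track every attribute somewhere and rebuild.
    FontStyle fresh = new FontStyle(currentFace, currentSize, currentItalic);

    // With the prototype: copy whatever the default currently is.
    FontStyle cloned = defaultStyle.copy();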
Did that make any sense?