Sounds like we're after the basic concept of indexing. Without indexing, you'd have to read through all of your site content to find something. So if I searched for "java" you might read some static HTML pages, some database entries, some JSP source, who knows what. Very slow indeed.
Indexing reads all of that stuff once and makes a list of what it finds and where. So if I search for "java" we could go to the index and see that "java" was found on a dozen pages and get the URLs very quickly.
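That "list of what it finds and where" is usually called an inverted index: a map from each word to the pages that contain it. Here's a toy sketch of the idea (the class and method names are just made up for illustration; real indexers like Lucene do far more):

```java
import java.util.*;

public class InvertedIndex {
    // word -> set of page URLs that contain it
    private final Map<String, Set<String>> index = new HashMap<>();

    // Read a page once and record where each word was found
    public void addPage(String url, String content) {
        for (String word : content.toLowerCase().split("\\W+")) {
            if (word.isEmpty()) continue;
            index.computeIfAbsent(word, k -> new TreeSet<>()).add(url);
        }
    }

    // A search is now just one map lookup, no matter how big the site is
    public Set<String> search(String word) {
        return index.getOrDefault(word.toLowerCase(), Collections.emptySet());
    }

    public static void main(String[] args) {
        InvertedIndex idx = new InvertedIndex();
        idx.addPage("/faq.html", "Java questions and answers");
        idx.addPage("/home.html", "Welcome to our site");
        idx.addPage("/beans.jsp", "JavaBeans and JSP tips in Java");
        System.out.println(idx.search("java")); // the pages that mention "java"
    }
}
```

The expensive reading happens once, at indexing time; each search after that is a cheap lookup.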
Indexing can be quite tricky. What if you want to find two words that are close together, or have a search for "party" also find "parties"?
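The "party"/"parties" trick is called stemming: you normalize words to a common root before putting them in the index and before looking them up. Real stemmers (like the Porter stemmer that Lucene's analyzers can use) are much smarter, but a crude suffix-stripping sketch shows the idea:

```java
public class SimpleStemmer {
    // Toy suffix stripping, just to illustrate the idea -- not a real stemmer
    static String stem(String word) {
        word = word.toLowerCase();
        if (word.endsWith("ies")) return word.substring(0, word.length() - 3) + "y";
        if (word.endsWith("es"))  return word.substring(0, word.length() - 2);
        if (word.endsWith("s"))   return word.substring(0, word.length() - 1);
        return word;
    }

    public static void main(String[] args) {
        System.out.println(stem("parties")); // party
        System.out.println(stem("party"));   // party
    }
}
```

Since "parties" and "party" both index under "party", a search for either one finds pages containing the other.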
Lucene, mentioned above, is a nifty indexer. You feed your content through it once and it builds an index that can handle all kinds of slick search requests. I use it on a Wiki that lets anyone update any page any time. When somebody changes a page, I re-index just that page. The folks who built Lucene also made Nutch, which I believe can "crawl" your site and index all the pages for you.
Does that head in the right direction?