I have been making sitemap generators.
A little script that generates an XML file with every product, category, and brand (and combinations thereof), along with every news article and static page. One site has 8000-odd URLs!
The sitemaps also include a timestamp of when each URL's data was last modified.
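The generator itself is straightforward: loop over the URLs, emit a `<url>` entry with a `<loc>` and a `<lastmod>` for each. A minimal sketch of that in Python (the example URLs and dates are made up, not my actual data):

```python
from datetime import date
from xml.etree import ElementTree as ET

def build_sitemap(entries):
    """Build sitemap XML from (url, last_modified_date) pairs."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        # lastmod uses the W3C date format, YYYY-MM-DD
        ET.SubElement(url, "lastmod").text = lastmod.isoformat()
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            + ET.tostring(urlset, encoding="unicode"))

# Hypothetical entries for illustration
entries = [
    ("https://example.com/products/widget", date(2007, 3, 1)),
    ("https://example.com/news/launch", date(2007, 3, 2)),
]
print(build_sitemap(entries))
```

In practice the entries come from the product/category/brand tables in the database rather than a hard-coded list.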
There are about 20 sites with these auto-generated sitemaps.
I use the search engines' ping URLs to notify them whenever the sitemaps are updated. It all runs from a cron job in the middle of the night: it regenerates the sitemaps for every site and then tells the search engines that they have changed.
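The ping step is just a GET request with the sitemap's URL passed as a query parameter. A rough sketch (the endpoint list is an assumption based on the engines' published ping URLs; check each engine's documentation for the current ones):

```python
from urllib.parse import urlencode
from urllib.request import urlopen

# Assumed ping endpoints -- verify against each engine's own docs
PING_ENDPOINTS = [
    "http://www.google.com/ping",
    "http://www.bing.com/ping",
]

def build_ping_url(endpoint, sitemap_url):
    # The sitemap URL must be percent-encoded as a query parameter
    return endpoint + "?" + urlencode({"sitemap": sitemap_url})

def ping_search_engines(sitemap_url):
    # A plain GET is enough; a 200 response means the ping was received
    for endpoint in PING_ENDPOINTS:
        ping_url = build_ping_url(endpoint, sitemap_url)
        try:
            with urlopen(ping_url, timeout=10) as resp:
                print(ping_url, "->", resp.status)
        except OSError as exc:
            print(ping_url, "failed:", exc)

print(build_ping_url(PING_ENDPOINTS[0], "https://example.com/sitemap.xml"))
```

The cron job just calls something like `ping_search_engines()` once per site after its sitemap has been regenerated.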
I still need to find the optimum interval between pings, and I need to wait until all the URLs in the sitemaps are indexed (276 of 7985 is not a good position to be in), but the basefeed helps out here too by driving traffic to the sites.
I will post updates. All the information you need to do this is on sitemaps.org :)