

Best Practice Makes Perfect

A collaboration with Domino developers about how to do it and how to get it right in Domino

Oh my ears and whiskers! There are several different aspects of best practices that I want to explore, but the most popular one is making it fast. And the most typical performance-related task is processing a whole lot of documents in an agent.

The first thing I ask myself (or ask whoever is looking for answers, if I can manage to horn in on the deliberations) is, do you really have to process all those documents? Can't you do a search to zoom in on just the documents that need to be changed? Do you really need to change those documents? Because, while it seems obvious that no work is done faster than the work you don't bother to do, I'm frequently amazed at the amount of unnecessary work that does get done. It's the antithesis of slack, and thus, a great evil.
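The idea of narrowing the work up front can be sketched generically. This is not the Notes API; the record structure and the `needs_fix` criterion below are invented for illustration. The point is the shape: select the matching subset first (the analogue of a selection-formula search), rather than walking and testing every document in your own loop.

```python
# Hypothetical records standing in for documents in a database.
records = [
    {"id": 1, "status": "open", "owner": ""},
    {"id": 2, "status": "closed", "owner": "alice"},
    {"id": 3, "status": "open", "owner": "bob"},
]

def needs_fix(rec):
    # Invented criterion: open records with no owner assigned.
    return rec["status"] == "open" and not rec["owner"]

# Touch only the matching subset; everything else is never modified.
for rec in filter(needs_fix, records):
    rec["owner"] = "unassigned"
```

Only record 1 matches, so records 2 and 3 keep their modification history, unread marks, and places in view indexes.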

The archetypal situation is a scheduled agent that deletes every document in a database and recreates them from some data store, overnight -- an extreme case but unfortunately common. While this is an easy agent to write, ease of implementation is really its only advantage. From every possible aspect of performance and usability, it stinks. Consider:

  • You have to modify twice the number of documents (all the old ones you delete, and all the new ones).
  • All those deletion stubs inflate the size of the database.
  • The next time someone uses each of your views, the index must be rebuilt, and it's worse than a from-scratch rebuild. If the view index doesn't exist yet, Notes has to test every document against the selection formula and insert the matches into the index. But your situation is worse than that: the index already exists, so Notes has to do the same work for all the new documents, plus examine every new deletion stub and remove the document it represents from the view.
  • If there are replicas, the next replication after your agent runs takes forever. This is especially fun for people who travel with local replicas on their notebooks.
  • Doclinks and URL links into this database break, because the documents they point to don't survive the rebuild. The link someone sent you yesterday is no longer valid.
  • Unread marks are useless.
The number of documents usually increases over time, so the performance problems become worse and worse. Often, the limiting factor is the available time to run the agent (which, depending on your exact coding techniques, may increase at a rate greater than O(n)). In other cases, you may discover the problem when a posse of irritated users shows up at your office, saying they had some time on their hands while they waited for a view to re-index, and thought they would use it to visit you. These sorts of unexpected occurrences disturb the Zen-like calm we are striving to cultivate.

One solution for the Delete And Replace All Syndrome is shown in the "Replicate Pirates" agent in the Dick Tracy sample database. This tests the data in the two sets of records to see what's changed, and only modifies Notes documents as actually needed. This is written so as to be easy to adapt to other cases where you must synchronize data between something else and Notes.
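The core of that approach can be sketched in a few lines. This is a hedged, generic sketch, not code from the sample database: the keys and field names are invented, and plain dictionaries stand in for the external records and the Notes documents. The technique is a key-based diff that creates, updates, and deletes only where the two sides actually differ.

```python
# Keyed records from the external data store (invented sample data).
source = {
    "A1": {"name": "Tracy", "rank": "Detective"},
    "A2": {"name": "Trueheart", "rank": "Officer"},
}
# Keyed stand-ins for the existing Notes documents.
existing = {
    "A1": {"name": "Tracy", "rank": "Sergeant"},    # stale -> update
    "A3": {"name": "Pruneface", "rank": "Villain"}, # vanished -> delete
}

created = updated = deleted = 0
for key, rec in source.items():
    if key not in existing:
        existing[key] = dict(rec)   # create only the missing documents
        created += 1
    elif existing[key] != rec:
        existing[key].update(rec)   # rewrite only the changed documents
        updated += 1
for key in list(existing.keys() - source.keys()):
    del existing[key]               # delete only the vanished documents
    deleted += 1
```

Unchanged documents are never touched, so view indexes, replication, doclinks, and unread marks all stay intact; the cost is one comparison per record instead of one rewrite per record.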

But also, it pays to be creative about the problem. Do you really have to store the data in the Notes database at all, or can you have the data looked up in real time when the application is used, perhaps with LEI or DECS? Do you need all the data, or just the last month's records? Is there a timestamp or list of recently deleted entries that you can access to speed up the process (if not, can you arrange for there to be)?
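If a modification timestamp is available on the source side, the incremental idea looks like this. Again a generic sketch with assumed field names, not a specific connector API: keep the time of the last successful run and fetch or process only records modified since then.

```python
from datetime import datetime, timezone

# Invented source rows, each carrying a last-modified timestamp.
rows = [
    {"id": 1, "modified": datetime(2007, 2, 25, tzinfo=timezone.utc)},
    {"id": 2, "modified": datetime(2007, 3, 1, tzinfo=timezone.utc)},
]
last_run = datetime(2007, 2, 28, tzinfo=timezone.utc)  # persisted between runs

# Only records changed since the last run need any attention at all.
to_process = [r for r in rows if r["modified"] > last_run]
```

Ideally the filter runs in the source system's query (so unchanged rows are never even transferred); the same trick needs a deletion log on the source side to catch removals, which is why it's worth arranging for one if it doesn't exist.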

The key consideration, again, is do you really have to mess with all those documents? Even reading a document is more work than you want to do unnecessarily. In future posts I'll discuss more ways to speed up accessing them if you really do need to.

Andre Guirard | 1 March 2007 03:11:56 PM ET | Plymouth, MN, USA | Comments (1)


1) Processing Lots of Documents - Must You?
Simon Boulton | 3/7/2007 10:11:34 AM

Andre - thanks - a good clear explanation of basic concepts leavened with humour. I all too often forget the basics when I've developed my latest sexy app . . . I'll be back for more!

2) Processing Lots of Documents - Must You?
Ben Langhinrichs | 3/8/2007 10:13:46 AM

Good post. I agree that the best way to save time is to not do the work that doesn't need doing, but it is easy to forget.

