Test Infrastructure: Domino 8.5 Reliability on W64 2003, AIX 5.3 and Solaris 10

By Gary Denner | Edited by Amy Smith, November 16, 2009 | Version 6

Tags: 64-bit, 8.5, AIX, Windows 2003, Solaris, mixed environment
Domino 8.5: W64 2003, AIX 5.3 and Solaris 10 Test Configuration


1 Overview

The objective of the IBM System Verification Test (SVT) team was to execute a set of test scenarios against a test configuration containing the key requirements and components, creating load on a mixed environment: W64 2003, AIX 5.3, and Solaris 10.

The testing used the scripts already in use by the system test team.

Perceived system quality is largely a function of overall system reliability. A widely accepted definition of software reliability is the probability that a computer system performs its intended function without failure over a specified time period within a particular execution environment. This execution environment is known formally as the operational profile, which is defined in terms of the sets of possible input values together with their probabilities of occurrence. An operational profile is used to drive a portion of the system testing. Software reliability modeling is then applied to data gathered during this phase of testing and used to predict failure behavior during actual system operation.
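As a concrete illustration of an operational profile, the Python sketch below represents one as a weighted set of user actions and samples from it to drive test input. The action names and probabilities are hypothetical, not those of the SVT scripts:

    import random

    # Hypothetical operational profile: possible input actions and their
    # probabilities of occurrence (values are illustrative only).
    OPERATIONAL_PROFILE = {
        "read_message":   0.40,
        "refresh_inbox":  0.20,
        "send_message":   0.15,
        "delete_message": 0.15,
        "open_calendar":  0.10,
    }

    def next_action(profile):
        """Draw one user action according to the profile's probabilities."""
        actions = list(profile)
        weights = [profile[a] for a in actions]
        return random.choices(actions, weights=weights, k=1)[0]

    # Drive a portion of system testing from the profile.
    for _ in range(10):
        print(next_action(OPERATIONAL_PROFILE))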

A reliability test focuses on the extent to which a feature or system provides its intended function without failing. The goal of all such testing is to improve the product's reliability and to make specific, measurable statements about it. Reliability reflects the impact of the failures, malfunctions, errors, and other defect-related problems encountered by customers; it is a measure of the continuous delivery of correct service (and of the time to failure).

SVT's purpose in running reliability tests was to ascertain the following:

· Data population for all parts of the infrastructure, forcing the set limits to be reached and exceeded

· Running sustained reliability scripts at more than 100% of maximum capacity, assessing:

  · Breakpoints

  · System stability before and after the breakpoint

  · Serviceability

· Forcing spikes and anti-spikes in usage patterns (a load-pattern sketch follows this list)

· Exposing the SMTP, IMAP, and POP3 services to 110% of their maximum load

· Filling the database table spaces to their maximum, proving that maximum, and proving the ability to recover to a good state when the maximum limits have been exceeded

· Proving that serviceability errors and warnings appear when thresholds are hit
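To make the spike and anti-spike goal concrete, the sketch below generates an hourly load multiplier around a 100% baseline, with forced surges above maximum capacity and forced lulls below it. The curve shape, spike hours, and magnitudes are illustrative assumptions, not the actual SVT load profile:

    import math
    import random

    def load_multiplier(hour, spike_hours=(9, 14), antispike_hours=(3,)):
        """Return a capacity multiplier for a given hour of the day.

        Baseline follows a gentle daily sine wave around 1.0 (100% of
        maximum capacity); listed hours are forced above or below it.
        """
        base = 1.0 + 0.05 * math.sin(2 * math.pi * hour / 24)
        if hour in spike_hours:        # forced spike: ~110% of maximum load
            return base + 0.10 + random.uniform(0.0, 0.05)
        if hour in antispike_hours:    # forced anti-spike: sharp usage drop
            return base - 0.50
        return base

    for h in range(24):
        print(f"{h:02d}:00  {load_multiplier(h):.2f}x max capacity")

In a run like the one described here, such a multiplier would scale the number of active driver sessions hour by hour.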

2 Configuration for W64 2003 / AIX 5.3 / Solaris 10

The W64 2003 / AIX 5.3 / Solaris 10 configuration used the following system specifications:

                  W64 2003 machine         AIX 5.3 machine                    Solaris 10 machine
System            8840 - xSeries 346       9113 - pSeries 550                 Sun SunFire V240
Processor         2 CPUs, 3.4 GHz          4 CPUs, 1.65 GHz                   2 CPUs, 1.5 GHz
Memory            4096 MB                  8192 MB                            4096 MB
Operating System  Win64 2003 Server, SP1   AIX 5.3, revision 5300-08-02-0822  SunOS 5.10
Domino Server     Lotus Domino 8.5         Lotus Domino 8.5                   Lotus Domino 8.5

The environment evaluated consisted of three machines in total: one W64 2003, one AIX 5.3, and one Solaris 10. Each machine hosted two Domino partitions (DPARs), and each partition hosted 1,000 registered users, making the total NAB for the environment 6,000 users (3 machines × 2 DPARs × 1,000 users). For W64 2003 and AIX 5.3, all mail files were local to the Domino servers; on Solaris 10, the mail files were located on a NAS.

Circular transaction logging was enabled, and each partition was configured with two mail.box databases. The transaction logs were located locally on each server for the purpose of this test.

The Design task was run at the start of the test to upgrade the templates. The Update task ran for the entire period of the test. Updall ran between 2:00 AM and 5:00 AM, or until it finished.

2.1 Evaluation Criteria

The performance of Domino 8.5 was evaluated against the following criteria:

· Server CPU: The overall CPU utilization of the server was monitored over the course of the experiment. The aim was for CPU utilization not to exceed 75%, allowing the server to function appropriately. It was acceptable for the CPU to spike to this level occasionally for a short period, but it had to return to a lower level. High CPU results from the server being stressed by running processes such as compact, fixup, or replication, by user load, or by third-party programs.

· Domino Process CPU: In addition to overall server CPU, the CPU consumption of Domino-specific processes was monitored and evaluated individually.

· Server Memory: The server memory metric represents the amount of physical memory available on the server. If available memory becomes low, server performance can be compromised.

· Server Disk I/O: The disk is a source of contention when a server is under load and performing a high number of read and write operations. The disk queue length was measured to determine whether disk I/O operations were creating a bottleneck for system performance.

· Network I/O: These metrics monitor the network utilization to ensure the bandwidth consumption is acceptable and that the network is not overloaded.

· Response Times from the End-user Perspective: The server response times for user actions represent how long a single user must wait for a given transaction to complete. This metric captures the user's experience of the application. At times, response times were longer when a server was under load. When response times increase over an extended period, or persist at high levels (for example, when a database or view takes longer than 30 seconds to open), they indicate that performance limits are being hit, and detailed analysis must be performed to determine the source of the slowdown and find a remedy (see the sketch after this list).

· Open Session Response Times: In addition to monitoring the individual action response times, the Open session response times were also evaluated in order to ensure the server remains responsive over the course of the experiment.
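As an illustration of how the response-time criteria above can be applied mechanically, the sketch below scans recorded open times for any single sample over the 30-second limit and for sustained elevation across a window. The sample data, window length, and sustained threshold are illustrative assumptions:

    # Hypothetical samples: (hour offset, database open time in seconds).
    samples = [(0, 1.2), (1, 1.4), (2, 31.0), (3, 2.1), (4, 8.0), (5, 9.5), (6, 9.8)]

    HARD_LIMIT = 30.0      # any open slower than this is an immediate flag
    SUSTAINED_LIMIT = 5.0  # assumed threshold for "persistently high"
    WINDOW = 3             # assumed number of consecutive samples to check

    for t, secs in samples:
        if secs > HARD_LIMIT:
            print(f"t={t}h: open took {secs}s (> {HARD_LIMIT}s) -- investigate")

    for i in range(len(samples) - WINDOW + 1):
        window = samples[i:i + WINDOW]
        if all(secs > SUSTAINED_LIMIT for _, secs in window):
            start, end = window[0][0], window[-1][0]
            print(f"t={start}h..{end}h: response times persistently above "
                  f"{SUSTAINED_LIMIT}s -- detailed analysis needed")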

2.2 Tools

To simulate user activity and capture the evaluation metrics discussed in section 2.1, a number of tools were used:

· Server.Load: Server.Load is a capacity-planning tool used to run tests against a targeted Domino server and to measure server capacity and response metrics.

· Domino showstats data: The Domino showstats output captures important server metrics. A Server.Load client driver can execute the showstats console command at regular intervals for each server in the configuration, providing Domino-specific data. The resulting data is logged to a text file and can be graphed for analysis (a parsing sketch follows this list).

· Open session: The Open session tool measures mail-file request/response times. It opens a view of a mail database at a set time interval and records the response time in milliseconds; a server slowdown can then be identified by analyzing the resulting response times.

· System performance meters: perfmon on W64 2003, perfmeter on Solaris 10, and tprof on AIX 5.3. These were used to graph CPU usage, disk I/O utilization, and LAN utilization.
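As an example of the showstats workflow, the sketch below reduces a logged capture to a time series for one statistic, ready for graphing. The snapshot-header format and the file name are assumptions; adapt the regular expressions to the actual capture format:

    import re

    # Assumed capture format: a timestamp line introduces each snapshot,
    # followed by "Statistic = Value" lines from the showstats command.
    SNAPSHOT = re.compile(r"^SNAPSHOT\s+(?P<ts>\S+)")
    STAT = re.compile(r"^(?P<name>[\w.]+)\s*=\s*(?P<value>-?[\d.]+)\s*$")

    def parse_showstats(path, stat_name):
        """Return [(timestamp, value)] for one statistic across all snapshots."""
        series, current_ts = [], None
        with open(path) as f:
            for line in f:
                m = SNAPSHOT.match(line)
                if m:
                    current_ts = m.group("ts")
                    continue
                m = STAT.match(line)
                if m and m.group("name") == stat_name and current_ts:
                    series.append((current_ts, float(m.group("value"))))
        return series

    # e.g. track user sessions over the 7-day run (file name is hypothetical):
    # for ts, v in parse_showstats("dpar1_stats.txt", "Server.Users"):
    #     print(ts, v)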

2.3 Evaluation Process

The Server.Load tool places a load on the Domino server. To simulate realistic load on the Domino servers, a total of six client drivers running Server.Load scripts were used, one per DPAR.

The tests run were SMTP/POP3, 8.5 mail (NRPC load), and 8.5 mail with Sametime (NRPC plus Sametime/buddy-list load). All of these scripts ran 24 hours per day for 7 consecutive days. By the nature of the scripts, there was a ramp-up and a ramp-down period of one hour, replicating a real scenario within a company.

2.4 Scenario: Online Mode

This scenario evaluates the performance of Lotus Notes clients in online mode. In online mode, user mail files are stored and maintained on the Domino server: every time a user performs an action, the request is sent to the server, and the mail file is modified and updated on the server side.
N85Mail and N85Mail with Sametime script (with attachment size modification): workload actions

Action                            Count per hour per user   Count per 24 hours per user
Refresh inbox                     4                         96
Read message                      20                        480
Reply to all                      2                         48
Send message to one recipient     4                         96
Send message to three recipients  2                         48
Create appointment                4                         96
Send invitation                   4                         96
Send RSVP                         4                         96
Move to folder                    4                         96
New mail poll                     4                         96
Delete two documents              4                         96
Total messages sent               16                        384
Total transactions                52                        1248

Table 1

Table 1 shows the action workload of the built-in N85Mail and N85Mail with Sametime script, with modifications to the attachment size. The script reflects the workload expected of a single user over the course of a day.
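The per-day and environment-wide message volumes follow directly from Table 1. A quick worked check (counts transcribed from the table; the 6,000-user figure comes from section 2):

    # Per-user hourly counts for the message-sending actions in Table 1.
    sends_per_hour = {
        "reply_to_all": 2,
        "send_to_one_recipient": 4,
        "send_to_three_recipients": 2,
        "send_invitation": 4,
        "send_rsvp": 4,
    }

    messages_per_hour = sum(sends_per_hour.values())   # 16, matching Table 1
    messages_per_day = messages_per_hour * 24          # 384, matching Table 1

    USERS = 6000  # registered users across all six partitions (section 2)
    print(f"{messages_per_day * USERS:,} messages offered per day")  # 2,304,000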
Message distribution in the N85Mail and N85Mail with Sametime script

Message size distribution   Percent of messages sent   Attachment size (if any)
0 < size <= 1 KB            5.9%                       N/A
1 KB < size <= 10 KB        66.0%                      N/A
10 KB < size <= 100 KB      25.0%                      50 KB
100 KB < size <= 1 MB       2.8%                       N/A
1 MB < size <= 10 MB        0.3%                       10 MB

Table 2

The resulting mail distribution is shown in Table 2. The POP3/SMTP workload has a similar distribution to the figures in the tables above; however, exact profiling was not done for this script for the 8.5 release, so no table is available at this time.
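For reference, the Table 2 distribution can be sampled to reproduce the script's message mix. In the sketch below, the bucket probabilities come from Table 2, while the uniform within-bucket size choice is an illustrative assumption (the script may use fixed sizes per bucket):

    import random

    # (lower bound, upper bound in bytes, probability) from Table 2.
    K, MB = 1024, 1024 * 1024
    BUCKETS = [
        (0,       1 * K,    0.059),
        (1 * K,   10 * K,   0.660),
        (10 * K,  100 * K,  0.250),
        (100 * K, 1 * MB,   0.028),
        (1 * MB,  10 * MB,  0.003),
    ]

    def sample_message_size():
        """Pick a size bucket per Table 2, then a size within it."""
        lo, hi, _ = random.choices(BUCKETS,
                                   weights=[p for _, _, p in BUCKETS])[0]
        return random.randint(lo + 1, hi)  # buckets are (lo, hi]

    sizes = [sample_message_size() for _ in range(10_000)]
    print(f"mean message size: {sum(sizes) / len(sizes):,.0f} bytes")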

3 Test drivers

The Server.Load workload was generated by six “driver” workstations. In addition, two separate drivers were used: one for a Lotus Notes Administration client and one for running statistics collection and for routinely monitoring delays in opening databases on each Domino partitioned server. A Sametime 8.0.1 server was also present in the test configuration in order to run the 8.5 mail with Sametime load; this server was not under any explicit load, and its performance was not measured.

4 Conclusion and Summary

The test results demonstrate that the W64 2003 / AIX 5.3 / Solaris 10 system, configured as described in this report, was able to support up to 6,000 concurrent, active Notes 8.5 users with an average response time below 2 seconds.

The addition of other application workloads will affect the number of users supported as well as the response time.

Achieving optimum performance in a customer environment depends heavily on selecting adequate processor power, memory, and disk storage, as well as on balancing the hardware configuration and appropriately tuning the operating system and the Domino software.

5 Machine details



6 Configuration settings

The following notes.ini variables were added to each of the Domino servers:

DAOSDeferredDeleteInterval=30

Unreferenced NLO files are deleted only after this deferred deletion interval (in days) has elapsed; this deletion of NLO files is known as “pruning.”

DAOSBasePath=DAOS

The DAOS base path. A relative value is resolved against the data directory: if the path is left as DAOS and the data directory is C:\Lotus\Domino\Data, the full path to the repository is C:\Lotus\Domino\Data\DAOS.

DAOSMinObjSize=4096

The minimum size an attachment must be for DAOS to store it; here, 4096 bytes.

DAOSEnable=1

Enables DAOS.

DAOS_CATALOG_VERSION=3

Sets the DAOS catalog version to 3.

DAOSCatalogState=2

The state of the DAOS catalog.

CREATE_R85_DATABASES=1

Enables ODS 51 as the default on-disk structure for newly created databases.
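Because notes.ini edits like these must be repeated consistently across six partitions, a small script can help apply them. The sketch below idempotently merges the settings above into one partition's notes.ini; the file path is an example only, and the server should be down (or the settings applied with the set configuration console command instead) before editing notes.ini directly:

    # Settings from this section, applied to one partition's notes.ini.
    SETTINGS = {
        "DAOSDeferredDeleteInterval": "30",
        "DAOSBasePath": "DAOS",
        "DAOSMinObjSize": "4096",
        "DAOSEnable": "1",
        "DAOS_CATALOG_VERSION": "3",
        "DAOSCatalogState": "2",
        "CREATE_R85_DATABASES": "1",
    }

    def merge_notes_ini(path, settings):
        """Rewrite notes.ini with each setting added or updated in place.

        Keys are matched case-sensitively here as a simplification;
        notes.ini itself treats variable names as case-insensitive.
        """
        with open(path) as f:
            lines = f.read().splitlines()
        remaining = dict(settings)
        out = []
        for line in lines:
            key = line.split("=", 1)[0].strip()
            if key in remaining:
                out.append(f"{key}={remaining.pop(key)}")  # update existing
            else:
                out.append(line)
        out.extend(f"{k}={v}" for k, v in remaining.items())  # append new
        with open(path, "w") as f:
            f.write("\n".join(out) + "\n")

    # merge_notes_ini(r"C:\Lotus\Domino\notes.ini", SETTINGS)  # example path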
