
The Association of System Performance Professionals

The Computer Measurement Group, commonly called CMG, is a not-for-profit, worldwide organization of data processing professionals committed to the measurement and management of computer systems. CMG members are primarily concerned with performance evaluation of existing systems to maximize performance (e.g., response time, throughput) and with capacity management, where planned enhancements to existing systems or the design of new systems are evaluated to find the resources required to provide adequate performance at a reasonable cost.

This paper was originally published in the Proceedings of the Computer Measurement Group’s 2003 International Conference.

For more information on CMG please visit http://www.cmg.org

Copyright Notice and License

Copyright 2003 by The Computer Measurement Group, Inc. All Rights Reserved. Published by The Computer Measurement Group, Inc. (CMG), a non-profit
Illinois membership corporation. Permission to reprint in whole or in any part may be granted for educational and scientific purposes upon written application to
the Editor, CMG Headquarters, 151 Fries Mill Road, Suite 104, Turnersville, NJ 08012.

BY DOWNLOADING THIS PUBLICATION, YOU ACKNOWLEDGE THAT YOU HAVE READ, UNDERSTOOD AND AGREE TO BE BOUND BY THE
FOLLOWING TERMS AND CONDITIONS:

License: CMG hereby grants you a nonexclusive, nontransferable right to download this publication from the CMG Web site for personal use on a single
computer owned, leased or otherwise controlled by you. In the event that the computer becomes dysfunctional, such that you are unable to access the
publication, you may transfer the publication to another single computer, provided that it is removed from the computer from which it is transferred and its use
on the replacement computer otherwise complies with the terms of this Copyright Notice and License.

Concurrent use on two or more computers or on a network is not allowed.

Copyright: No part of this publication or electronic file may be reproduced or transmitted in any form to anyone else, including transmittal by e-mail, by file
transfer protocol (FTP), or by being made part of a network-accessible system, without the prior written permission of CMG. You may not merge, adapt,
translate, modify, rent, lease, sell, sublicense, assign or otherwise transfer the publication, or remove any proprietary notice or label appearing on the
publication.

Disclaimer; Limitation of Liability: The ideas and concepts set forth in this publication are solely those of the respective authors, and not of CMG, and CMG
does not endorse, approve, guarantee or otherwise certify any such ideas or concepts in any application or usage. CMG assumes no responsibility or liability
in connection with the use or misuse of the publication or electronic file. CMG makes no warranty or representation that the electronic file will be free from
errors, viruses, worms or other elements or codes that manifest contaminating or destructive properties, and it expressly disclaims liability arising from such
errors, elements or codes.

General: CMG reserves the right to terminate this Agreement immediately upon discovery of violation of any of its terms.
Learn the basics and latest aspects of IT Service Management at CMG's Annual Conference - www.cmg.org/conference

Performance and Scalability Verification of .Net Application

Renuka S.R
Infosys Technologies, Bangalore, India
[email protected]

Buy the Latest Conference Proceedings and Find Latest Computer Performance Management 'How To' for All Platforms at www.cmg.org
Join over 14,000 peers - subscribe to free CMG publication, MeasureIT(tm), at www.cmg.org/subscribe

Analyzing and verifying the performance and scalability of a new technology has been a challenge for most businesses that adopt it during the initial phase of its evolution. With minimal references from the vendors, the task of performance verification becomes critical. This paper discusses the experience of verifying the performance of the .Net architecture for a complex application with thousands of users. The paper introduces the quantitative techniques that were used in verifying the performance and scalability of the application.

1. Introduction

A large number of businesses have critical applications running on technologies that are obsolescent, with poor prospects for support, interoperability and programmer availability. These applications eventually need to be reengineered, and reengineering them requires crucial decisions from the performance and scalability perspective. Drawing on the experience of verifying the performance of a complex client-server inventory application reengineered using .Net, this paper shares the issues faced and the approach followed to effectively verify the performance of the application.

Common challenges associated with performance analysis of applications built on a new technology are:

• Vendors are not experienced with the use of the technology for applications with complex characteristics (large numbers of components, users and data). Hence, they are unaware of its performance in such situations.
• References are limited because only a small number of businesses have adopted the new platform. This limits the peer experience that can be used to analyze performance.
• Unknown factors that would impact the results and decisions make the correctness of the performance verification critical.

The verification and analysis techniques discussed in the paper are applicable to any application built on .Net technology. The inventory application itself uses ASP .Net and SQL Server 2000. Listed below are its important characteristics:

• Complex user screens with several tabs and grids containing large data.
• More than six thousand users.
• More than two thousand locations accessing the application.
• A database of approximately 800 GB.

2. Performance Verification

Performance testing is a good approach to verifying the performance of a system in a multi-user scenario. There are several steps involved in performance testing – setting up the test environment, creating test scripts using a COTS (Commercial Off-The-Shelf) load testing tool, executing the scripts to simulate a multi-user scenario, and analyzing the test results. This section describes the test environment and details the scenarios that had an impact on the analysis.

The performance test set-up of the application is shown in Figure 1. Load was generated by the load testing tool running on 1-, 2- and 4-CPU Dell machines, against the .Net server (Dell PE 1400SC, 2 CPU, 866 MHz, 512 MB RAM), a SQL Server 2000 session state server (Dell PE 1400SC, 2 CPU, 866 MHz, 512 MB RAM) and a SQL Server 2000 database server (Dell PE 8450, 4 CPU, 700 MHz, 4 GB RAM).

Figure 1 Performance Test Setup

Find a CMG regional meeting near you at www.cmg.org/regions



The application functionality is implemented using ASP .Net for the user interface and SQL Server 2000 as the database. A separate database is used to store the sessions of the users.

2.1 Increasing the number of simulated users

Load testing comprises generating and increasing the number of simulated users to analyze the impact of multiple users on the application performance. Most COTS load testing tools can be configured to increase the number of users during a test run. A load-testing tool creates a specified number of simulated user threads and runs the test script for each user thread several times.

A .Net server (like other software servers) creates software resources – components, threads and database connections – for every new user request and caches these resources. Hence, the processing time of the server is considerably higher for the first run of the test script than for subsequent runs; the point at which these resources are in place is referred to as the server reaching a steady state.

The number of simulated users should be increased only after the server has reached a steady state. A test conducted by increasing the number of users from 10 to 170, adding 100 users within a small time interval of five minutes, is shown in Figure 2. The response times are inconsistent, with high deviations for the same number of simulated users. The results of this test provide little insight into the performance of the system.

Figure 2 Increasing simulated users rapidly (response time variation with users)

The same test run with longer time intervals between increases in simulated users is shown in Figure 3. The response times are consistent and report lower values for the same number of users. The results of these tests indicated that a test execution should account for the time taken by the server to stabilize before analyzing performance.

Figure 3 Increasing simulated users gradually (response time variation with users)

As a result, while testing performance using a load testing tool, the time interval between increases in users should ensure that the test script has run multiple times for each simulated user.

2.2 Verifying the Performance Test

The primary reason to conduct a performance test is to identify the bottlenecks in the system. If the performance test is incorrectly executed, it leads to incorrect analysis, and the cause of bottlenecks may never be identified. The significance of performance test verification is illustrated with the experience of testing the inventory application.

The inventory application provides a "search for inventories" operation. The ASP.Net page for this operation consists of several user interface components designed to ensure that users of the existing client screens have minimal usability issues when they move to the new application. The search operation was performance tested with the number of simulated users increased from 10 to 60. The result of the test is shown in Table 1.

Number of    Throughput          Request Time =            Little's Law
Users        (Operations/sec)    (Response time +          Validation
(N)          (X)                 Think Time)               N = X*(R+Z)
                                 Seconds (R+Z)
10           0.46                21.07                      9.69
20           0.5                 38.19                     19.09
30           0.41                53.7                      22.01
40           0.35                65.96                     23.08
50           0.305               78.82                     24.04
60           0.31                97.35                     30.17

Table 1 Search Inventory results showing Little's law validation

The response times increased and the throughput of the search operation reduced rapidly. The resource utilization (CPU, disk, memory and network) of the


servers on which the application was deployed was less than 30%. Hence, the server resources were not the cause of the performance bottleneck.

The application logs indicated that the total processing time at the .Net server was 10 seconds, while the response time reported by the load-testing tool was 97 seconds. After monitoring the application and the deployment configuration, the resource utilization of the machines running the load-testing tool was monitored. The CPU utilization of those machines reached 80% – hence the load-testing tool was the bottleneck.

For other applications tested using the same configuration of machines, the load-testing tool was capable of simulating 100 users. However, for this application the tool (which uses a DOM – Document Object Model – parsing mechanism for running the test script) was unable to parse the complex ASP pages and could simulate only 20 users.

The bottleneck was easily identified by using Little's law [1] to verify the performance test. Little's law states that the number of users in the system is the product of the throughput of the operation (operations/sec) and the time taken for servicing the operation, referred to as the request time (response time + think time).

The validation of Little's law for the results indicates that the product of throughput and request time fails to match the number of users beyond 20 users. Validating the performance test using Little's law ensures that the required number of users is actually simulated in the test environment.

Situations where verifying the performance test helped identify genuine problems are:
1. The load generator was a bottleneck, unable to simulate the required number of users due to the complexity of the application being tested and the limitations of the hardware configuration on which the load generator was running.
2. Excessive failures of user requests owing to page session time-outs. The session time-outs were not reflected in the load generator report; however, the results failed to comply with Little's law.

Load-testing tool capabilities can be verified using operational laws. Failing to do so would lead to incorrect conclusions about the application's capacity.

3. Performance Optimization

Tuning and optimization are often specific to the application. Some generic steps taken to reduce the size of the pages and the data transferred between the browser and the server, for optimal use of the network bandwidth, are:
1. Replacing view state with session data stored in the database reduced utilization of network bandwidth. View state is a .Net mechanism (a hidden variable with name-value pairs) by which data can be sent to the client and back to the server.
2. Replacing combo boxes with text boxes in search screens resulted in a significant reduction in page size.

For further optimization, the code was profiled and the database was tuned to achieve acceptable performance. Some of the common techniques that improved performance are:
• Algorithm optimization
• Low-level program optimization (use of StringBuilder, reference types, structured exceptions)
• Addition of relevant indexes in the database
• Addition of hints to the SQL query plan

4. Scalability Analysis of .Net

It is important to identify the options available for upgrading the hardware configuration to handle an increased user base while maintaining the same performance. Analyzing scalability requires three metrics [3]:

Speedup S: the increase in the rate of doing work with an increase in the number of processors – the ratio of the completion times, which can be extended to the ratio of throughputs.

Efficiency E: measures the work rate per processor and is the ratio of speedup S to the number of processors.

Scalability Ψ(k1,k2): from k1 processors to k2 processors, the ratio of the efficiencies. It has an ideal value of unity.

    Ψ(k1,k2) = E(k2)/E(k1) = (S(k2)/k2)/(S(k1)/k1) = (X(k2)/k2)/(X(k1)/k1)    .... (1)

where X(k) = throughput of the application with k processors.

Using equation (1), the scalability of ASP .Net with SQL Server 2000 was computed with reference to the Nile benchmarks [5]. The computed results are shown in Table 2.

Ψ(k1 CPU, k2 CPU)    Scalability
Ψ(2,4)               0.978
Ψ(2,8)               0.721

Table 2 Scalability across processors
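The scalability computation of equation (1) can be sketched as follows. The throughput figures below are illustrative placeholders, not the measured Nile benchmark values; they are chosen only so that the resulting ratios reproduce the Ψ values reported in Table 2.

```python
# Sketch of the scalability metric from equation (1):
# psi(k1, k2) = (X(k2)/k2) / (X(k1)/k1), i.e. the ratio of per-processor
# throughputs (efficiencies). A value near 1.0 means near-linear scaling.

def scalability(throughput, k1, k2):
    """Ratio of efficiency at k2 processors to efficiency at k1 processors."""
    eff_k1 = throughput[k1] / k1
    eff_k2 = throughput[k2] / k2
    return eff_k2 / eff_k1


# Illustrative throughputs (operations/sec) for 2, 4 and 8 processors,
# chosen to reproduce the ratios in Table 2 -- not measured values.
X = {2: 100.0, 4: 195.6, 8: 288.4}

print(round(scalability(X, 2, 4), 3))  # 0.978 -> scales well to 4 CPUs
print(round(scalability(X, 2, 8), 3))  # 0.721 -> poor scaling at 8 CPUs
```

Because the metric depends only on throughput ratios, it can be computed directly from benchmark throughput tables without measuring completion times.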


Table 2 indicates that the architecture does not scale well to 8 processors. The scalability obtained by increasing the number of processors from 2 to 8 is very low compared with that obtained by scaling up from 2 to 4. Hence, a scale-out approach based on 4-processor servers (adding 4-processor servers) should be adopted to handle increases in workload.

5. Conclusion

Verification of performance and scalability requires the correctness of performance test execution and analysis. The paper highlights experiences and insights gained in analyzing the performance and scalability of a .Net application using simple techniques.

References

[1] Daniel Menasce, Virgilio Almeida and Larry Dowdy. 1994. Capacity Planning and Performance Modeling: From Mainframes to Client-Server Systems. Prentice Hall, NJ, USA.
[2] Edward Lazowska, et al. 1984. Quantitative System Performance. Prentice Hall, NJ, USA.
[3] X.H. Sun and L.M. Ni. 1993. Scalable Problems and Memory-Bounded Speedup. Journal of Parallel and Distributed Computing, Vol. 19, pp. 27-37.
[4] Prasad Jogalekar and Murray Woodside. June 2000. Evaluating the Scalability of Distributed Systems. IEEE Transactions on Parallel and Distributed Systems, Vol. 11, No. 6, pp. 589-603.
[5] The Nile Ecommerce Application Server Benchmark. October 2001. http://www.gotdotnet.com/team/compare/nileperf.aspx
[6] Daniel Menasce and Virgilio Almeida. 2001. Challenges in Scaling E-Business Sites. Proceedings of the Computer Measurement Group.
[7] Deep K. Buch and Vladimir M. Pentkovski. 2001. Experience of Characterization of Typical Multi-Tier E-Business Systems Using Operational Analysis. Proceedings of the Computer Measurement Group.
