Testing Notes and Domino applications


Chuck Norris doesn't test. For the rest of us: you need a plan when preparing to test Notes applications. Traditionally Notes application testing has been an afterthought. After all, adding a few fields onto a form and creating a few columns in a view was all you needed for an entry-level Notes application. Like the slowly boiled frog, we often don't recognise the point when an application has grown so big that a more rigorous test regime starts to make sense. While TDD is all the rage for code-driven development, it hasn't taken a strong hold in the Domino world, which is very visually driven.
A lot of times the first thought about systematic testing only arises when you want to upgrade a server: Will all my applications run? Do I need to change anything?
In general Notes and Domino are fiercely backward compatible, so the odds are with you. However, the odd change (like a variable name you used that later became a class name) and subtle alterations in behaviour will require that you at least have a look.
Now testing an existing application that has been built anyhow is a completely different animal from building an application suitable for automated testing. It is like building a house with proper heating and plumbing vs. retrofitting them into an existing building - just ask your average castle owner. So let us have a look at the two use cases:
  • Testing an existing application

    This is the trickier use case since your application most likely deeply connects your user interface with your data back-end. So a few steps are necessary to perform tests.
    • Know your code. Using Teamstudio Analyzer, DXLMagic, Ytria ScanEZ or OpenNTF's Source Sniffer, you can get a feel for the code base: where are your agents, what @Db* functions are at work, where are external system calls, how complex is your code, and where have developers committed a crime by omitting Option Declare. Teamstudio's Upgrade filters provide a handy way to get the pay-attention-to-this list when planning to test for upgrades.
    • Prepare a baseline version of your application. Baseline doesn't mean empty, but fully configured with sample data (you want to test "this value gets edited and changed" too). ZIP away the baseline so you can repeat the test.
    • Prepare an environment that reflects your use cases. The two main fallacies in testing are using a small database and a fast network (and then acting surprised when users complain about the performance of the large database on the slow network). So you might end up having a baseline with a lot of data preconfigured. Using Apache's TCPMon you can simulate a slow network connection even in your exclusive single-user gigabit test network. (There are more tools like that.)
    • Write a test plan. Depending on who (or what) executes it, it can consist of short sentences or elaborate instructions. Make sure to include edge cases where values come close to a limit, hit it exactly, or exceed it a little.
      A Notes database is a good place to keep a test plan (I don't mean attaching a text document to a Notes database, but having a proper form with all test data). Test plans ensure that you don't miss something. Test plans are living documents: your test plan WILL be incomplete, so be prepared to update it frequently (this is why every test deserves its own Notes document, with versioning). When you look at a test plan you will find striking similarities to use cases; they are in fact natural extensions of each other. While the use case describes what the user (or system) does, the test plan describes in detail how it is done.
    • Pick a test tool that can replay testing sequences. Here it gets a little tricky. IBM doesn't offer a tool for Notes client testing. There is server.load, but that is mostly suitable for mail performance testing. Luckily the Yellowsphere stepped in and SmartTouchan's Autouser fills the gap. It allows for both functional and performance testing. When you test Domino web applications your selection is wider. You need to distinguish between functional and performance testing:
      • Performance:

        Here you can test either the raw Domino performance, by sending HTTP requests to the Domino server and timing the responses, or the overall performance including your specific browser's JavaScript performance. Typically you go for the first case. You can start with JMeter or Rational Performance Tester (there are others, but my pay cheque doesn't come from there). For a first feel of the idea, see the request-timing sketch after this list.
      • Functionality:

        Top of my list is Rational Functional Tester (same reason as above), but I also like Selenium, which nicely runs in your Firefox (a minimal WebDriver sketch in Java follows after this list). There are almost infinite combinations of browsers and operating systems, so it is no surprise that you can find a number of offerings that do the cross-browser testing chore for you. Some of them can run against your intranet application, so you don't need to open up your applications to the wild west.
      There is no magic-bullet application. Testing is time-consuming and comes with a learning curve (guess what a lot of interns do).
    • Run the tests in various environments (if you test for upgrades) or before and after code changes (if you test for performance or regression avoidance)
    • Be very clear: the cost of test coverage grows exponentially and you can't afford 100%. A good measure is to multiply the likelihood of an error by the economic damage it can do; spend more on testing than that and you are wasting money. (For example, a defect with a 5% likelihood of occurring and a potential damage of $10,000 justifies roughly $500 of testing effort.) Of course that is a slippery slope if an application error can lead to physical harm; this is where I see the limit of "assess economic damage only".
    • Your test plan should start with the most critical and most used functions and then move on to the lesser used and less critical actions. Repeat until you run out of time or budget.
  • Building applications for testability

    Your overall test cycle gets shorter and more efficient when you design an application for testability from the very beginning. The approach is called "Test Driven Development" (TDD). In a nutshell: you write your tests before you write code (so they fail at first), then you write code until the tests pass. This works well for, well, code. In a highly interactive visual environment like Domino Designer that is much harder. Very often business logic is hidden (pun intended) in hide-when formulas. You (partially) need to let go of such ideas. Some pointers:
    • Put business logic into script libraries. You can more easily write test methods that call these functions (see the unit test sketch after this list)
    • Have a mechanism that generates your test data, so you can start over with a test series anytime (see the data generator sketch after this list)
    • Use the XPages unit tests on OpenNTF
    • Abstract your logic from its implementation. So instead of writing @DbLookup(""; @DbName; "configLookup"; "departments"; 2) you write systemConfig.getDepartments() or at least systemConfig.getConfig("departments") and implement the actual data access in a library (a Java sketch of this pattern follows after this list).
    • There is much more to say.... somewhen
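
Before the usual disclaimer, here are the sketches promised above. First the raw performance idea: JMeter is the proper tool for this, but a few lines of Java illustrate what such a test does - request a page, drain the response, take the time. The URL is invented; substitute one of your own views:

    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class SimpleLoadProbe {
        public static void main(String[] args) throws Exception {
            // Hypothetical Domino view URL - substitute your own
            URL url = new URL("http://yourserver/orders.nsf/allOrders?OpenView");
            for (int i = 0; i < 10; i++) {
                long start = System.currentTimeMillis();
                HttpURLConnection con = (HttpURLConnection) url.openConnection();
                int status = con.getResponseCode();
                InputStream in = con.getInputStream();
                byte[] buffer = new byte[8192];
                while (in.read(buffer) != -1) {
                    // drain the response, so the full transfer is measured
                }
                in.close();
                System.out.println("Request " + (i + 1) + ": status " + status
                        + ", " + (System.currentTimeMillis() - start) + " ms");
            }
        }
    }

This is a toy probe, not a load test: it runs single-threaded and ignores caching, keep-alive and think times - exactly the things JMeter handles for you.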
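
For functional web testing, here is a minimal sketch using Selenium's WebDriver API in Java. The URL, field and button names are invented for illustration - Domino renders your form fields as HTML inputs with matching names, but check the generated source for the exact values:

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.firefox.FirefoxDriver;

    public class OrderFormTest {
        public static void main(String[] args) {
            // Drives a real Firefox instance against the application
            WebDriver driver = new FirefoxDriver();
            try {
                // Hypothetical Domino web form - substitute your own URL
                driver.get("http://yourserver/orders.nsf/Order?OpenForm");
                driver.findElement(By.name("Department")).sendKeys("Sales");
                driver.findElement(By.name("Subject")).sendKeys("Test order");
                driver.findElement(By.name("SaveButton")).click();
                if (!driver.getPageSource().contains("Order saved")) {
                    throw new AssertionError("Expected confirmation not found");
                }
                System.out.println("Functional test passed");
            } finally {
                driver.quit();
            }
        }
    }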
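
The test data generator can be as simple as a Java agent that rebuilds a known set of sample documents. Form and field names below are assumptions - adjust them to your application:

    import lotus.domino.*;

    public class GenerateTestData extends AgentBase {
        public void NotesMain() {
            try {
                Session session = getSession();
                Database db = session.getAgentContext().getCurrentDatabase();
                // Create a predictable set of sample documents,
                // so every test series starts from the same state
                for (int i = 1; i <= 50; i++) {
                    Document doc = db.createDocument();
                    doc.replaceItemValue("Form", "Order");
                    doc.replaceItemValue("Subject", "Sample order " + i);
                    doc.replaceItemValue("Amount", new Double(i * 100));
                    doc.save(true, false);
                    doc.recycle();
                }
            } catch (NotesException e) {
                e.printStackTrace();
            }
        }
    }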
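
And here is one way the abstraction pattern from the last pointer could look in Java. The view name configLookup and the method names come from the example above; everything else is an assumption about your design:

    import java.util.ArrayList;
    import java.util.List;
    import lotus.domino.*;

    public class SystemConfig {
        private final Database db;

        public SystemConfig(Database db) {
            this.db = db;
        }

        // Hides the view lookup behind a plain method call
        public List<String> getConfig(String key) throws NotesException {
            List<String> values = new ArrayList<String>();
            View lookup = this.db.getView("configLookup");
            ViewEntryCollection entries = lookup.getAllEntriesByKey(key, true);
            ViewEntry entry = entries.getFirstEntry();
            while (entry != null) {
                // Second column holds the value, mirroring the 2 in @DbLookup
                values.add(entry.getColumnValues().get(1).toString());
                ViewEntry next = entries.getNextEntry(entry);
                entry.recycle();
                entry = next;
            }
            return values;
        }

        public List<String> getDepartments() throws NotesException {
            return getConfig("departments");
        }
    }

If you later move the configuration to a profile document or an external system, only this class changes - your forms, agents and tests stay untouched.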
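
Once the logic lives behind such a class, a unit test can exercise it without any UI. A sketch using JUnit, where TestHarness.openBaselineDatabase() stands in for whatever hypothetical helper opens your baseline database:

    import static org.junit.Assert.assertFalse;

    import org.junit.Test;
    import lotus.domino.Database;

    public class SystemConfigTest {
        @Test
        public void baselineContainsDepartments() throws Exception {
            // Hypothetical helper that opens the prepared baseline database
            Database db = TestHarness.openBaselineDatabase();
            SystemConfig config = new SystemConfig(db);
            assertFalse("Baseline should contain at least one department",
                    config.getDepartments().isEmpty());
        }
    }
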
As usual YMMV

Posted on 18 May 2011 | Comments (1) | categories: Show-N-Tell Thursday

Comments

  1. posted by bruce lill on Friday 20 May 2011 AD:
    The problem is not having comments to help. The last example, the @DbLookup, is easy to figure out. But which SSJS library is the systemConfig in? It could be loaded by any one of the custom controls on the XPage.
    If the requirements or design document were actually written and stored in the app it would help.
    I just go with my own notes and Selenium for testing the web side. I've yet to have a customer willing to spend 20% more for documentation.