wissel.net

Usability - Productivity - Business - The web - Singapore & Twins

By Date: November 2008

What type is that blog?


The analysis indicates that the author of http://www.wissel.net is of the type:

INTJ - The Scientists

The long-range thinking and individualistic type. They are especially good at looking at almost anything and figuring out a way of improving it - often with a highly creative and imaginative touch. They are intellectually curious and daring, but might be physically hesitant to try new things.

The Scientists enjoy theoretical work that allows them to use their strong minds and bold creativity. Since they tend to be so abstract and theoretical in their communication they often have a problem communicating their visions to other people and need to learn patience and use concrete examples. Since they are extremely good at concentrating they often have no trouble working alone.


Check your own Blog at Typalizer

Posted by on 24 November 2008 | Comments (1) | categories: After hours

Capturing Software Requirements


In an earlier post I recommended Alistair Cockburn's book Writing Effective Use Cases (Agile Software Development Series). Now a new project on OpenNTF headed by Dave Parillo is implementing Alistair's approach as outlined in the book. It is an early release for now, but it nevertheless deserves attention and support. Go get it.

Posted by on 23 November 2008 | Comments (0) | categories: SYWTBADD

Advanced DECS usage


Relational data models are a very popular abstraction used in IT. And an abstraction they are, rather than a mapping: I haven't come across a relational table in real life, only documents (the ones signed by a national bank president are my personal favorites). So while document databases, objects and attributes are a better fit to the real world, RDBMS are well understood, come with a powerful query language and are reasonably standardized. Naturally you will come across the requirement to connect Notes and Domino applications to a relational back-end. The options are plenty: ODBC (bad idea), JDBC, LC LSX, DECS, LEI or DB2NSF (not mentioning the 3rd party tools). Typically I see the RDBMS connections entangled in code, creating more of this. A better way to separate concerns is to remove RDBMS connections from your code and let the server handle that. In Domino you can use DECS (Domino Enterprise Connection Services) and LEI (Lotus Enterprise Integrator) for that. DECS has been part of Domino since some R5 version; LEI is happily sold to you by your local IBM sales rep. I will focus on DECS for this post.
The typical DECS use is to define a data connection (tip: use OLEDB, not ODBC, to connect to MS-SQL), a data-form mapping and to import the primary keys. In the data-form mapping (a.k.a. Activity in DECS terminology) you set which events you want to monitor: create, read, update, delete. This typically works like a charm: any data that is updated on the RDBMS is automatically pulled into the Notes form when it is opened. The biggest drawback: DECS doesn't monitor record creation or deletion on the RDBMS side (that, among other capabilities, is what LEI is made for). So DECS seems to be confined to cases where creation/deletion is limited to a Domino-side activity. Also, DECS can't trigger stored procedures. With a little creativity however you can push the use of DECS far beyond that. I'll describe some use cases I came across where DECS was used to avoid mixing Domino and RDBMS code in a function or an agent:
  • Employee information is stored in an RDBMS as part of the HRMS. The employee ID is populated into the Domino Directory (yes, it has a field for that). In a Notes application data is needed from the RDBMS. Instead of writing LC LSX code the application simply looks up the empPara document that is linked to the RDBMS using DECS. The document might not exist yet (if the user has never used that application before). If it does not exist it is created and populated just with the EmpID (see the sketch after this list). When closed and reopened it will pull the employee information from the RDBMS. This is possible since the DECS task only monitors read/update. A scheduled agent removes documents when there is no longer a match in the Domino Directory. In summary: if you can anticipate or know the primary key of a record you need, you can use DECS by not monitoring creation/deletion.
  • Based on a workflow, stored procedures in an Oracle database need to be triggered. Here some work was done both on the RDBMS and the Domino side, clearly separating the two. An auxiliary table was created in Oracle with an INSERT trigger that would execute the stored procedure using parameters given in that table and write back the success of the operation. Initially it was planned to purge the table regularly (using a Domino agent deleting the documents), but then the auditors loved the additional documentation, so archival was established instead. During the project there were a lot of changes on both ends, yet the approach of using a trigger-driven table proved to be very efficient at separating the two environments and minimizing interference. E.g. one stored procedure would generate a unique identifier according to some obscure, constantly changing rule. By storing that result in the aux table and then creating a corresponding Notes document after reading the document linked to that table (again, create was not monitored in that scenario), the application flow was seamless.
  • Enterprise parameter management [Updated] was created in an RDBMS application. Many Domino applications needed to use these parameters. Instead of having LotusScript code in every application doing RDBMS lookups, DECS was used to populate and update a parameter NSF. In the activity the option "leave values in documents" was selected, so fast selections (like @DBColumn or NotesView.getColumnValues) would work. Since parameter changes happened rarely, a copy of the "populate keys" agent from the DECSAdm database was created that would periodically shut down the activity for the parameter NSF (only that activity, not the whole DECS server), pull the keys from the RDBMS based on the activity definition but only insert new keys (the original agent duplicates keys when you run it twice), and lastly restart the activities in DECS.
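To make the stub trick from the first bullet concrete, here is a minimal sketch in Java against the lotus.domino classes. The form name empPara and the field EmpID come from the scenario above; the view name and the way the employee ID is obtained are placeholders for illustration only:

import lotus.domino.AgentBase;
import lotus.domino.Database;
import lotus.domino.Document;
import lotus.domino.Session;
import lotus.domino.View;

// Creates the DECS-backed stub document if it does not exist yet.
// DECS (monitoring read/update only) fills in the RDBMS columns when the document is next opened.
public class EnsureEmpParaStub extends AgentBase {
    public void NotesMain() {
        try {
            Session session = getSession();
            Database db = session.getAgentContext().getCurrentDatabase();
            View byEmpId = db.getView("empParaByEmpId"); // placeholder view keyed on EmpID
            String empId = session.getUserName();        // in the real application: looked up from the Domino Directory
            Document stub = byEmpId.getDocumentByKey(empId, true);
            if (stub == null) {
                stub = db.createDocument();
                stub.replaceItemValue("Form", "empPara");
                stub.replaceItemValue("EmpID", empId);
                stub.save(true, false);
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}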
You can of course rightfully ask: why shouldn't I just use some LC LSX code to connect? It isn't much more trouble. The answer is short: DECS allows separation of concerns. Your Domino developers deal with what they know best: Domino. You retain the interface (and interfaces are the pieces that can make or break an upgrade of any system) configurable outside of your own code. The whole UI for configuring and mapping has been tried and tested for years and you can open a PMR (the IBM lingo for: bug report) against it for IBM to deal with problems. You can't do that with your own code. DECS also deals with format translation and relieves you from the temptation to write inefficient SQL, and last but not least DECS takes care of connections and connection pooling.

Posted by on 21 November 2008 | Comments (7) | categories: Show-N-Tell Thursday

Practical Magic with DXL


My Best Practices track submission for Lotusphere didn't make it. However, my development topic did. So I'm speaking in Orlando.
AD215: " Lotus Domino XML (DXL) allows you to extract information from your Notes database. Come to this session to learn a number of tips and tricks on how to use information to make your applications more functional and maintainable. For example, unification of the design of your views, separation of LotusScript from forms, comprehensive documentation for your applications are capabilities any developer would want. You'll also see how to use DXL to help generate XPages from existing design elements like forms and views. You'll leave this session with a number of new techniques on how to use this powerful tool. Attention: Contains Live Code!"

Posted by on 19 November 2008 | Comments (1) | categories: Lotusphere

Use Swim Lanes to Document System Components


A common challenge in software development is to synchronize the different phases and stakeholders in a development project. Business users care about the business functionality, infrastructure people about the system setup (servers, network, storage etc.), interaction designers about the UI, developers about code libraries and so on. Typically you have a different set of artifacts to document and cover the various aspects. While looking at the forest of information you might lose sight of the trees. How does a user requirement map into a story, a use case, a system module, a piece of infrastructure? A neat way to show the connection between all of these is a swim lane diagram. Swim lane diagrams are a part of UML and typically used to show the flow between modules of a system. I'm using swim lanes to visualize application flow with the help of Sequence, which allows me to type the flow rather than draw all of it. But the use of swim lanes is not limited to program flow. I have a great history book that uses swim lanes to show what happened on every continent over a time line. Back to software development: you can use swim lanes to document the development process and its components: story board, use case, feature, user experience, business process, tools and systems. Have a look at a great example and the explanation around it, as well as some more thoughts and downloads. How do you get your story in sync?

Posted by on 19 November 2008 | Comments (1) | categories: Software

applicatio sine qua non*


If you only have money for one intranet application running on your Domino server (besides the blog and discussion templates that came with the server and all the goodies from OpenNTF), it would be IdeaJam from elguji Software llc. My favorite other Notes application (which runs on the client) also uses IdeaJam to collect ideas. No idea what IdeaJam is? Just watch the video:

Ready to learn more? Take the tour. IdeaJam (as of the time of this writing) is available in English, English (GB), French, Spanish, Italian and German. If you order enough copies they would make a Chinese or Bahasa version too.

* The application you can't be without.

Posted by on 16 November 2008 | Comments (0) | categories: IBM Notes Lotus Notes

Lotus Notes Applications and other eMail Systems


Microsoft's competitive strategy against Lotus Notes is to migrate eMail to Exchange (more likely to Outlook, with Exchange included as collateral damage) and to sunset or migrate applications thereafter. Looks good on paper, and eMail is eMail, isn't it? The hangover from the "Hooray, we'll do Outlook" party comes when running the numbers on application migration. Remember: every minute/dollar spent on migration doesn't get spent on new user requirements and productivity improvements. Their poster child Accenture, with Steve Ballmer sitting on their board (might that have influenced their migration decision?), took more than six years, and according to Vidya S. Byanna, Global Infrastructure Executive Director at Accenture, 200 business-process-supporting databases are still not migrated (nicely put by Vidya: "are in the process of migrating"). Update: Accenture has removed the blog entries, but the web does not forget! Quite an interesting finding given their access to Microsoft and their claimed technical expertise. So if you have started a migration, good luck to you. Running your numbers carefully, you might end up retaining and extending Lotus Domino as your collaborative platform (there are a lot of companies doing just that). eMail is sooo last century. Of course the question of interoperability needs to be answered. When you build web applications, you probably already wrote a class MailNotification that generates notification eMails with http hyperlinks in the message (a sketch of such a class follows after the list below). If you have existing (client) applications you need to deal with the way @MailSend and NotesDocument.send work:
  • You have retained your Notes clients: don't do anything. Connect your other eMail system to the Domino system using SMTP and the SMTP router of Domino will convert the DocLink into a hyperlink using the notes:// protocol. Of course I presume you have configured your Domino server properly (hostname anyone?). If your users typically work with local replicas, the links created will point to local databases using notes:///. That can be a problem if the receiving end doesn't use a local replica. In such cases use Geniisoft's CoexLinks.
  • You web-enable your application using the Domino http task (classic or XPages alike): you either need to touch all your applications and replace @MailSend/NotesDocument.send or use Geniisoft's CoexLinks. CoexLinks is an unobtrusive server task that does link conversion. It also takes care of DocLinks that Lotus Notes users send you. And NO, there is no magic button for web enablement. Depending on the code quality and structure of your applications it can be very easy or a little painful. See my session at Lotusphere to learn what could be automated.
  • Your messages use stored forms and those forms contain actions, buttons and the like: sorry, you need to rework these parts.
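For the web application case, the MailNotification class mentioned above could look roughly like this. It is a sketch against the lotus.domino back-end classes; the URL layout and the host parameter are illustrative, not a prescription:

import lotus.domino.Database;
import lotus.domino.Document;
import lotus.domino.NotesException;
import lotus.domino.RichTextItem;

// Sends a notification mail that carries an http hyperlink instead of a DocLink.
public class MailNotification {
    private final String webHost; // e.g. "http://notes.example.com" - illustrative

    public MailNotification(String webHost) {
        this.webHost = webHost;
    }

    public void send(Document target, String recipient, String subject) throws NotesException {
        Database db = target.getParentDatabase();
        // illustrative URL layout: http://host/path.nsf/0/unid
        String url = webHost + "/" + db.getFilePath().replace('\\', '/')
                + "/0/" + target.getUniversalID();
        Document memo = db.createDocument();
        memo.replaceItemValue("Form", "Memo");
        memo.replaceItemValue("Subject", subject);
        RichTextItem body = memo.createRichTextItem("Body");
        body.appendText("Please review: " + url);
        memo.send(false, recipient);
    }
}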

Posted by on 16 November 2008 | Comments (6) | categories: IBM Notes Lotus Notes

Supporting Notes Users in Bandwidth Challenged Environments


In a recent meeting with a client the question was raised: "How do I support users in bandwidth challenged environments?" Bandwidth challenged as in GSM (no GPRS), modem dialup, satellite links and the like. My first instinctive answer was, of course, replication. Notes was around when 9600 Baud was considered fast, and replication was working then. But after reflecting on the question for a while I had to answer: it depends. You have a number of options depending on your use case. In general there are two strategies to look into: a) let data transmission happen outside the user's time (a.k.a. in the background); b) minimize data transmission. These are the options:
  • Replication: This is the clear choice for email and informational databases like document libraries, discussions, team rooms etc. The clear advantage: while you do other things the background replication task makes sure all information reaches your desktop. It also has the clear advantage of data being available off-line. Replication is less suitable for applications where you actually only need a small subset of the whole data (typically in workflow applications) or where data transmission is very expensive. You can tweak replication settings to accommodate that: e.g. in the location "Expensive" you only receive your email (and other) database(s), while in the location "Internet" everything is replicated (the sending of new messages is handled by the router, so it will go out). You also can limit the amount of information replicated (see your admin help for details).
  • Mail routing: In workflow applications, when requesting action or approval, it is usual to just send an email with a link to the workflow document. For low bandwidth situations that could be changed to sending a whole form that includes the action buttons. That form could be made part of the mail design (if it is generic enough) or could be sent using "Store form in document". A decision maker would get the entire information in the inbox and can click the button, which triggers a return mail to the mail-in enabled main application. The mail is stored there as documentation and the main document is updated.
  • Forms bin: This is the "other" end of the mail routing concept. A central database contains all the forms for all the workflow applications used by bandwidth challenged users and as much of the look-up configuration as possible. This database gets copied onto the workstation (either when they are in good network conditions or via CD-ROM). Users fill out forms there, but the forms don't get stored in the forms bin; they get emailed to a mail-in database that is the main application. You could add a non-replicating "personal bin" to keep local copies. This way only documents that are relevant to the user are transmitted. The forms bin replicates (probably receive only), so updates to the forms, form removals or new forms are properly reflected.
  • Feed enablement: To get an overview of what is happening and what action is required, pull a summary through RSS into your favorite reader. While that is a read-only approach it might fit a lot of needs. Since Domino 7.0.2 there is a feed wizard that can generate feeds without touching your existing application. Of course you can also take a peek into IBM's and OpenNTF's templates and have the RSS generated inside your application (a minimal sketch follows after this list).
  • Sametime enablement: Add a Sametime bot to your application, so users can use simple commands to retrieve or act on data there. While it is minimalist, it is also frugal on the bandwidth. IBM has toolkits for Java and C++, while our business partners Botstation and Instant Tech provide libraries for LotusScript. Works great on mobile devices too.
  • MQ enablement: This is a variation of the forms bin approach. Using the Expeditor framework in the Notes 8 client you can use MQ to send the data; with a little work (sample on request) it works on R6/R7 clients too. Advantage here: your application doesn't need to worry about on-line/off-line, and the data transmitted is very small. Disadvantage: you need to get used to MQ (and obviously install it).
  • Web enablement: Since 4.x it has been possible to render Notes forms in the browser. There is a large body of knowledge out there on how to do that. Of course you want pages to be very light for challenged bandwidth, add compression, or use XPages, which does a lot of optimization for you (you want to use Firefox for its better handling of JavaScript caching).
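As promised in the feed enablement bullet, here is a minimal sketch of generating RSS inside the application, written as a Java web agent (invoked via ?OpenAgent). The view name, host and column layout are placeholders, and XML escaping is omitted for brevity:

import java.io.PrintWriter;
import lotus.domino.AgentBase;
import lotus.domino.Database;
import lotus.domino.Session;
import lotus.domino.View;
import lotus.domino.ViewEntry;
import lotus.domino.ViewEntryCollection;

// Renders the first column of a view as a minimal RSS 2.0 feed.
public class SimpleFeed extends AgentBase {
    public void NotesMain() {
        try {
            Session session = getSession();
            Database db = session.getAgentContext().getCurrentDatabase();
            View pending = db.getView("PendingApprovals"); // placeholder view name
            PrintWriter out = getAgentOutput();
            out.println("Content-Type: application/rss+xml\n"); // first line sets the content type for web agents
            out.println("<?xml version=\"1.0\"?><rss version=\"2.0\"><channel>");
            out.println("<title>Pending actions</title><link>http://yourserver/app.nsf</link>");
            out.println("<description>Action required</description>");
            ViewEntryCollection entries = pending.getAllEntries();
            ViewEntry entry = entries.getFirstEntry();
            while (entry != null) {
                String title = entry.getColumnValues().get(0).toString();
                out.println("<item><title>" + title + "</title><link>http://yourserver/app.nsf/0/"
                        + entry.getUniversalID() + "</link></item>");
                ViewEntry next = entries.getNextEntry(entry);
                entry.recycle();
                entry = next;
            }
            out.println("</channel></rss>");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}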

Posted by on 13 November 2008 | Comments (3) | categories: Show-N-Tell Thursday

40 students in the classroom


The school year in Singapore is over. Anthony and Ernest are home until January. Their results were good, so they will go to the two top classes. Our school admits the top 80 students into two classes with a wider and deeper curriculum. So there will be forty students in a class. 40 is a good solid number: Jesus went to the desert for 40 days, Ali Baba had to deal with 40 robbers and it is just two short of the answer to all questions of the universe.
But 40 nine-year-old kids in one room seems a little high to me. So I did a little research. Dr. Ng Eng Hen (Minister for Education) quoted a study published by McKinsey in September 2007 entitled "How the world’s best performing school systems come out on top". According to his quote there seems to be no significant relation between class sizes and results (only 9 out of 112 studies found a positive effect). The key is supposed to be the quality of the teachers. While I fully agree with the importance of teacher quality, I do have some doubt about the class size findings. The result could be a victim of a lack of ceteris paribus: when the size of a class is reduced, more teachers are needed. Since more teachers are needed, less qualified teachers are hired. Less qualified teachers lower the results. It would be interesting to take 3 equally qualified and experienced teachers and let them teach 3 classes: one with 40 and two with 20 students each, and then compare. I would want to make a bet here <g>. Of course that doesn't solve the "where are all the highly qualified teachers for all that many (small) classes" question. A study focusing on 3rd grade entitled "Teachers’ Training, Class Size and Students’ Outcomes" and published in 2008 comes to a radically different conclusion: "the effect of class size is substantial and significant, a smaller class size improves similarly all students’ reading test scores within a class". The study confirms that the teachers' training is equally significant.
There are quite some opinions out there: 19, 24, 25 (with a legal maximum of 33) or 35. A very promising sounding study by Neville Bennett was behind $$$. Bennett seems to be quite an authority on the topic of learning. The question is widely debated and I can't fend off that nagging feeling that most of the studies' results are subject to the experimenter's bias effect. I found evidence that two studies both quoted an earlier, third, study as evidence for their respective opposite conclusions. I've taught classes of different ages (12-70) and different sizes (3-30) myself and I don't think 40 is good for learning. So maybe Singapore's outstanding results are the result of world class tuition rather than the school system.

Posted by on 12 November 2008 | Comments (1) | categories: Twins

Estimating efforts for web enablement


A lot of organizations I talk to have a lot of Notes client applications that they want to make accessible through browsers. Some want browser only, some want dual access. All wonder how to estimate the effort needed properly. As a rule of thumb one can say: number of artifacts times time per artifact times the experience factor of the development team. This would make one equation with three unknowns, which can't be solved. I won't discuss the "time per artifact" in this post, since it is very dependent on the technology, process and tooling used. But I will shed some light on the other two variables.
In my experience the skill factor runs in powers of two: for a guru, champion or master developer your factor is 1 (or even less), for a well experienced developer 2, for an experienced developer 4, for an entry level developer 8, and for a novice 16. Of course even a novice can contribute if (s)he applies and polishes what the more experienced developers produce.
To determine the number of artifacts you would look at: forms, views, fields, columns, code events, lines of code. You can easily extract this information from a Notes database by exporting your design as DXL and then counting the respective tags. Since that's a little boring, let your computer do the counting. A few lines of Java will do the trick. Don't want to write that? Well, then just download this. It is sample code written in Java 6 (no Notes classes used) that shows all tag types as well as lines of code for LotusScript and @Formula. You can run it from the command line: java InspectDesign yourdatabasedesign.dxl or use it in your code.
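The downloadable sample is the more complete tool; purely as an illustration of the approach, a minimal tag counter could look like this (Java 6, using the built-in StAX parser; the element names lotusscript and formula are what I expect to see in a DXL export):

import java.io.FileInputStream;
import java.util.Map;
import java.util.TreeMap;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

// Counts every element type in a DXL export plus a crude line count of embedded code.
public class DxlTagCounter {
    public static void main(String[] args) throws Exception {
        Map<String, Integer> tagCount = new TreeMap<String, Integer>();
        int codeLines = 0;
        XMLStreamReader reader = XMLInputFactory.newInstance()
                .createXMLStreamReader(new FileInputStream(args[0]));
        String current = null;
        while (reader.hasNext()) {
            int event = reader.next();
            if (event == XMLStreamConstants.START_ELEMENT) {
                current = reader.getLocalName();
                Integer seen = tagCount.get(current);
                tagCount.put(current, (seen == null) ? 1 : seen + 1);
            } else if (event == XMLStreamConstants.CHARACTERS
                    && ("lotusscript".equals(current) || "formula".equals(current))) {
                codeLines += reader.getText().split("\n").length; // rough count, text may arrive in chunks
            } else if (event == XMLStreamConstants.END_ELEMENT) {
                current = null;
            }
        }
        reader.close();
        for (Map.Entry<String, Integer> e : tagCount.entrySet()) {
            System.out.println(e.getKey() + ": " + e.getValue());
        }
        System.out.println("Lines of embedded code: " + codeLines);
    }
}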

Posted by on 10 November 2008 | Comments (2) | categories: Show-N-Tell Thursday

Documenting Project Progress


Software projects are not started and completed in a single day (well, except certain voting applications for certain design partners), so you need to track progress. The typical way is to estimate the amount of work, set this at 100% and have a line (or bar) chart that shows the percentage of completion over time. You all have seen those diagrams. But they suck big time. Inevitably, popular methods try to live without them. So while you work like a mad dog to complete a project, your progress line starts looking like one from an ER monitor that triggers the typical "We lost him". All the change requests don't get reflected in the percentage, so you don't seem to progress. But there is a better way!
Enter the burn chart. In a burn chart you document the "items of work left to do", so you get a line that at some point hits the x-axis: project completion. When change happens, you simply draw a vertical line up that shows the additional amount of work. You can then conveniently project how the additional work units move the project completion date. I've been playing with Dojo gfx to visualize such a burn chart. I opted for fat red changes and trajectory projections. With a few finishing touches (will take some days) I will have a neat function that can take in a Notes view with 2 values (units completed, units added) and show the real time line. I can imagine quite some projects where the progress will look like Sonic the Hedgehog.
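Leaving the Dojo gfx drawing aside, the arithmetic behind the burn chart is simple enough to sketch in a few lines of plain Java (the figures in main are made up):

// Computes the remaining-work series and a naive completion projection for a burn chart.
public class BurnChart {

    // remaining[i] = remaining[i-1] - completed[i] + added[i], starting from the initial estimate
    public static double[] remainingSeries(double initialEstimate, double[] completed, double[] added) {
        double[] remaining = new double[completed.length];
        double current = initialEstimate;
        for (int i = 0; i < completed.length; i++) {
            current = current - completed[i] + added[i];
            remaining[i] = current;
        }
        return remaining;
    }

    // Periods still needed at the average burn rate observed so far.
    public static double periodsToCompletion(double[] completed, double remainingNow) {
        double total = 0;
        for (double c : completed) {
            total += c;
        }
        double averageBurn = total / completed.length;
        return (averageBurn <= 0) ? Double.POSITIVE_INFINITY : remainingNow / averageBurn;
    }

    public static void main(String[] args) {
        double[] completed = { 5, 8, 6, 7 };
        double[] added = { 0, 3, 0, 4 };
        double[] remaining = remainingSeries(50, completed, added);
        System.out.println("Remaining now: " + remaining[remaining.length - 1]);
        System.out.println("Projected periods left: "
                + periodsToCompletion(completed, remaining[remaining.length - 1]));
    }
}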
See what is really happening in your project over time?

Posted by on 10 November 2008 | Comments (0) | categories: SYWTBADD

Domino Server and Domino System Template Versions


Every Domino server version comes with its set of system templates provided by IBM, most notably pubnames.ntf and admin4.ntf (the full list of templates is in your Administrator help file). IBM's recommendation for server upgrades is to upgrade the administration server of the Domino Directory first, including the templates. I recently encountered a number of customers who have concerns about upgrading the address book design when they also have older server versions in their network. While it is possible to prevent replication of the design, it is more trouble than it is worth. So let me clarify a few points about Domino servers and Domino system templates:
  • The Domino Directory template and the other system templates are designed to be fully backward compatible. Checking the technotes you will find that we recommend having the latest maintenance release in place before upgrading to a new main version.
  • Domino servers are designed to be forward compatible. A Domino server will only read the configuration values from the Domino Directory that it understands. New parameters are happily ignored.
  • A Domino server must run with the matching version of the system templates. That it does run with older template versions is lucky for you, but definitely not a supported configuration. And none of the new capabilities can be used, since you can't configure them.
  • Maintenance releases are published for a reason (go and check out the Fix List). IBM will not backport or provide a hotfix for problems that have already been addressed by a maintenance release.
So what can you do if you are not sure which versions of the templates are scattered over your domain?
The Admin help provides the list of system templates. First you need to check if you did any customization to your versions. If so, separate them as described before. Then use the admin client or a little bit of script to remove them all. You don't want to mess with different touch dates, so a bit of radical surgery is in order. Install a new server somewhere (your thumb drive is a good location) to "harvest" the original templates. You don't need to generate an ID or configure it; just install and copy the NTF files from the data directory. Use the admin client to add your server and admin groups to the ACL, both in their native format as well as in square brackets (using square brackets adds these groups to all databases newly created with those templates). Copy these templates to your administrative server (pros use replication for that). Then use the admin client to drag & drop the templates to the other servers. This will replicate them over (your adminp needs to work properly, but it does, doesn't it?). You might want to do that off-hours so as not to get in the way of regular replication activities.
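If you prefer to script the inventory step before the radical surgery, a small Java sketch along these lines lists the templates on a server (the server name is a placeholder; run it with an ID that has sufficient access):

import lotus.domino.AgentBase;
import lotus.domino.Database;
import lotus.domino.DbDirectory;
import lotus.domino.Session;

// Lists all templates on a server so you can spot stray or outdated NTFs.
public class TemplateInventory extends AgentBase {
    public void NotesMain() {
        try {
            Session session = getSession();
            DbDirectory dir = session.getDbDirectory("YourServer/YourOrg"); // placeholder server name
            Database template = dir.getFirstDatabase(DbDirectory.TEMPLATE);
            while (template != null) {
                template.open(); // properties like the master template name need an open database
                System.out.println(template.getFilePath() + " | " + template.getTitle()
                        + " | master template: " + template.getTemplateName());
                template = dir.getNextDatabase();
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}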

Posted by on 07 November 2008 | Comments (4) | categories: Show-N-Tell Thursday

Domino 8.5, what's in the box?


A customer recently asked: "How difficult is it to migrate from Domino 8.0 to Domino 8.5?" I gently had to remind him that we upgrade while others migrate. So what's the difference? A migration often involves new hardware, while an upgrade usually happens in-place (unless of course your box is rusty and you want a shiny new one). An upgrade doesn't alter data (much), while a migration typically requires data conversion. An upgrade can be rolled back very fast (in Domino: just start the server from the old binary directory - you might need a little help from the compact task beforehand). An upgrade also typically coexists very well with older releases. A migration comes with coexistence challenges.
Got it? Our colleagues from the WebSphere brand came up with a very interesting model for new functionality. Instead of releasing a new version altogether they publish feature packs. Taking this train of thought, you could look at Domino 8.5 as a combination of a maintenance release and a feature pack. It will include more than 3000 fixes since the release of Domino 8.0 and a set of new features:
domino85feature.jpg
  • DAOS, which stores each attachment only once per server, regardless of the number of users it was sent to
  • ID Vault, which helps you manage those Notes IDs once and for all (I haven't come across a better PKI store yet)
  • XPages, which allows you to do web 2.0 with elegance and ease. XPages is actually not new technology: it has been available for WebSphere Portal as Lotus Component Designer for quite a while. New is the deep integration into the Domino stack (and Domino Designer, but this is a server post)
Of course you want to check the Official Pages once 8.5 has been released.

Posted by on 02 November 2008 | Comments (1) | categories: IBM Notes Lotus Notes

The shadow of things to come


Lotus Notes and Domino 8.5 are closing in on their release date. There is a lot of buzz around XPages. But IBM doesn't stop there. Development for 8.5.1 is in full swing and planning for 9.0 kicked off long ago. Of course (can you spell NDA?) I can't tell what's happening next in the Domino space. But I can share a screenshot from my workstation (proudly running Ubuntu):
dominobuntu.png
Nota bene: I do not work in development, and I make no statements about what they do or will do.

Posted by on 02 November 2008 | Comments (7) | categories: IBM Notes Lotus Notes