VMware ESXi Development Server on a Lenovo T420

For some time I have been interested in setting up a development VMware ESXi server on a laptop to help with testing. The problem in the past has been that the ESXi installer only ships with drivers for high-end network cards, and the process of adding the required drivers is chewy.

As of ESXi 5.5 Update 3 and ESXi 6, the drivers for the Intel 82579LM network card are included in the core product, so as long as your laptop has more than 4GB of RAM installed (4GB is not enough, as it registers as 3.9GB and the install fails) you can install ESXi 5.5 or 6 on your laptop. I only found this out after doing all the chewy stuff first 🙂

The only other serious issue I have found is that when you are creating hard disks they must be IDE drives and not SCSI drives, otherwise they will not be visible to the guest BIOS.


There is a niggly issue with vSphere whereby the console does not always display. Right-clicking on the menu and selecting the pop-up console seems to free this up.


ESXi is free for up to two physical CPUs.

XPages Document Save Error may be a document locking error

Moral of the story – read the logs

This one had us stumped for a while, so I thought I would post it.

We had an intermittent error in an XPages application where it threw a save error even though the Author fields were correct.

The error was

Exception occurred calling method NotesDocument.save()

After much head scratching we followed the advice in log.nsf:

please consult error-log-0.xml located in /local/notesdata/domino/workspace/logs

whereupon we discovered the message

Could not save the document 5386 NotesException: Notes error: The document is already locked by someone else.
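On a Linux server a quick way to dig through those XML logs from the console is a grep along these lines (a sketch: the log directory is the one from the message above and may differ on your install, and the helper name is just mine):

```shell
#!/bin/sh
# Sketch: scan the XPages runtime error logs for NotesException messages.
# LOGDIR is assumed from the log.nsf advice above; adjust for your notesdata path.
LOGDIR="${LOGDIR:-/local/notesdata/domino/workspace/logs}"

find_notes_exceptions() {
  # Print the most recent NotesException lines found in the XML error logs.
  grep -h "NotesException" "$1"/error-log-*.xml 2>/dev/null | tail -n 20
}

find_notes_exceptions "$LOGDIR"
```

Run it as the Domino user so the log files are readable.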

So document locking was turned on, which is not really much use in an XPages application (we rolled our own application-scope-based scheme).

I know it is a pain to connect to the server to get the log documents, but it is worthwhile. I wonder, could we write an XPages app to recover them? 🙂


If you are using an Apache Proxy for Domino please check HTTPEnableConnectorHeaders

Jesper Kiær has posted a very compelling video showing how the HTTPEnableConnectorHeaders=1 notes.ini parameter can be used to gain access to Domino servers.

We no longer use the Apache proxy scheme as the SSL support in Domino has improved, but I tested this on one of our development servers by setting HTTPEnableConnectorHeaders=1 and using the “Modify Headers for Google Chrome” extension, and was able to get access.

As Jesper notes, many of the write-ups about using Domino behind a proxy (including mine) specify using this setting. There are some useful comments on the first post in Jesper's series about this issue.
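If you want to check your own server, the same test can be done with curl instead of a browser extension. This is a hedged sketch, not a definitive exploit check: $WSRU is one of the WebSphere connector headers Domino trusts when HTTPEnableConnectorHeaders=1 is set, and the host and user name below are placeholders. Only run it against servers you own.

```shell
#!/bin/sh
# Sketch: compare responses with and without a spoofed $WSRU connector header.
# If the second request comes back as an authenticated user, the server is
# trusting $WS* headers from arbitrary clients (HTTPEnableConnectorHeaders=1
# with no trusted proxy stripping them).
check_connector_headers() {
  url="$1"
  plain=$(curl -s -o /dev/null -w "%{http_code}" "$url")
  # CN=Admin/O=Org is a placeholder user name.
  spoofed=$(curl -s -o /dev/null -w "%{http_code}" -H '$WSRU: CN=Admin/O=Org' "$url")
  echo "plain=$plain spoofed=$spoofed"
}

# Example (placeholder host):
# check_connector_headers "http://dev.example.com/names.nsf"
```

If a resource that normally returns 401/403 suddenly comes back 200 with the header set, the setting is exposed to the outside world.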




Scheduling XAgents on a Linux box using Cron

We use scheduled code in our XPages apps to do things like pre-building dashboards and storing the data in the application scope.

There is no simple way to schedule Java code that has been written for use with XPages. The best solution I have found has been to use “XAgents” and to poll the corresponding URLs somehow.

I have tried this using various methods, but it has not been straightforward due to authentication issues and having to deal with SSL certificates.

The best solution I have come up with to date is to use the Linux cron scheduler and the wget command, which is typically used to download data via the web. This can be configured using the crontab -e command-line utility or, easier still, by using Webmin.

Step 1 – create an Internet Site mapped to localhost. On this site disable session authentication, as it seems to cause spurious issues with automated remote calls. Also disable any settings that force traffic to use SSL.

Step 2 – from the command line in Linux, test your proposed wget command – something like:

wget --user=USERNAME --password=XXXX "http://localhost/apps/aw.nsf/xp_f_process_bad_actors.xsp?source=Cron"

Step 3 – add this as a scheduled task using crontab -e or Webmin as shown below

@hourly wget --user=USERNAME --password=XXXX "http://localhost/apps/aw.nsf/xp_f_process_bad_actors.xsp?source=Cron"
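As a variant, wget can be told to discard the downloaded file and run quietly, which avoids cluttering the filesystem and keeps cron mail quiet (an untested sketch, with the same placeholder credentials as above):

```
@hourly wget -q -O /dev/null --user=USERNAME --password=XXXX "http://localhost/apps/aw.nsf/xp_f_process_bad_actors.xsp?source=Cron"
```

The -O /dev/null throws away the response body (we only care about triggering the XAgent) and -q suppresses wget's progress output.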


Additional Notes

When you run test downloads from Webmin, the downloaded files are stored in /usr/libexec/webmin/cron/.. although if you schedule the task as a user such as apache (as shown above) then no file is actually saved because of a permissions failure.

The cron log is at /var/log/cron

Your Domino username and password will be stored in plain text in the crontab files and in the logs.
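One small mitigation is to move the credentials into a ~/.netrc file for the user running the job, which wget consults by default; restrict it with chmod 600 (a sketch, same placeholder credentials as above):

```
machine localhost
login USERNAME
password XXXX
```

The --user and --password flags can then be dropped from the wget command. The password is still plain text, but at least it sits in one permission-restricted file rather than in the crontab and the cron logs.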

You could also use bash script files and schedule these via Domino Program Documents

Thoughts for a future iteration

Making the trigger URLs accessible to an anonymous user using sessionAsSigner somehow



Upgrading Domino causes issues with CKEditor for Chrome users

We have had a spike in help desk calls reporting general “oddness” in one of our XPages applications. This was triggered by an upgrade to the Domino server that included a new CKEditor version.

It seems as though Chrome (and possibly other browsers) caches the old CKEditor files, and these are not compatible with the new server version.
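One quick sanity check is to compare the CKEditor version the server is actually serving with what the browser has cached. This is a sketch: the exact URL for ckeditor.js is an assumption, so take it from your browser's network tab rather than trusting the placeholder below.

```shell
#!/bin/sh
# Sketch: pull the version string out of the minified ckeditor.js the server
# serves (CKEditor 4 embeds it as version:"x.y.z").
served_ck_version() {
  curl -s "$1" | grep -o 'version:"[^"]*"' | head -n 1
}

# Example with a placeholder URL; substitute the ckeditor.js URL your
# XPage actually loads:
# served_ck_version "http://dev.example.com/ckeditor/ckeditor.js"
```

If the version the server reports differs from the one in the browser's cached copy, you have the mismatch described here.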

Users of the XPage in edit mode see the following malformed page even after several days.



The immediate solution is to get the user to delete their cache – not an easy task, nor one that creates a good impression as a service provider 🙁

Users can do this by pressing CTRL + SHIFT + DELETE and then selecting the following options


Allowing the user to use the default settings will make you very unpopular, as they will lose lots of useful cached history.

I tried the simple delete cookies option but that didn’t work.

Although this post focuses on CKEditor, we have experienced other “general weirdness” after upgrades, which is usually sorted out by clearing the cache. Per Henrik Lausten has posted a good scheme to minimise these issues, but he says it does not help with the CKEditor issue.

Presumably that would require IBM to change the resource names for each CKEditor release, although it does look as though at least some of the files have a suffix that may be for that purpose.


Changing the NSF file path will not help, as the resources are served centrally, as the highlighted blue detail shows.

If anyone has any better ideas or a deeper understanding of this, please shout. It causes a lot of hassle and creates a poor impression when you need to tell people to clear their cache – especially when you are providing an external service.