Jesper Kiær has posted a very compelling video showing how the HTTPEnableConnectorHeaders = 1 notes.ini parameter can be used to gain access to Domino servers.
We no longer use the Apache proxy scheme, as the SSL support in Domino has improved, but I tested this on one of our development servers: with HTTPEnableConnectorHeaders = 1 set and the “Modify Headers for Google Chrome” extension installed, I was able to gain access.
As Jesper notes, many of the write-ups about running Domino behind a proxy (including mine) specify this setting. There are some useful comments on the first post in Jesper's series on this issue.
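The browser-extension test can also be reproduced from the command line. A minimal sketch, where the host, database path, and user name are purely illustrative: with HTTPEnableConnectorHeaders = 1, Domino trusts the WebSphere connector headers, so a client can simply assert an identity via the $WSRU header.

```shell
# Hypothetical command-line version of the header-spoofing test.
# Host, database path, and user name are placeholders, not real values.
# The command is echoed rather than executed here.
CMD='curl -H "$WSRU: John Doe/Acme" http://dev.example.com/names.nsf'
echo "$CMD"
```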
We used scheduled code in our XPages apps to do things like pre-building dashboards and storing the data in the application scope.
There is no simple way to schedule Java code that has been written for use with XPages. The best solution I have found has been to use “XAgents” and to poll the corresponding URLs somehow.
I have tried various methods for this, but it has not been straightforward due to authentication issues and having to deal with SSL certificates.
The best solution I have come up with to date is to use the Linux cron daemon and the wget command, which is typically used to download files over the web. This can be configured using the crontab -e command-line utility or, easier still, by using Webmin.
Step 1 – create an Internet Site document mapped to localhost. On this site, disable session authentication, as it seems to cause spurious issues with automated remote calls. Also disable any settings that force traffic to use SSL.
Step 2 – from the Linux command line, test your proposed wget command – something like:
wget --user=USERNAME --password=XXXX “http://localhost/apps/aw.nsf/xp_f_process_bad_actors.xsp?source=Cron”
Step 3 – add this as a scheduled task using crontab -e or Webmin, as shown below
@hourly wget --user=USERNAME --password=XXXX “http://localhost/apps/aw.nsf/xp_f_process_bad_actors.xsp?source=Cron”
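A variant of the hourly entry can discard the downloaded page and keep a log of each run instead; the log path here is an assumption, not a requirement:

```
# Hypothetical crontab variant: -O /dev/null discards the downloaded page,
# -a appends wget's log output to a file of your choosing.
@hourly wget --user=USERNAME --password=XXXX -O /dev/null -a /var/log/xagent-cron.log "http://localhost/apps/aw.nsf/xp_f_process_bad_actors.xsp?source=Cron"
```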
When you run test downloads from Webmin, the downloaded files are stored in /usr/libexec/webmin/cron/.. – although if you schedule the task as a user such as apache (as shown above), no file is actually saved because of a permissions failure.
The cron log is at /var/log/cron
Note that your Domino username and password will be stored in plain text in the crontab files and in the logs.
You could also use bash script files and schedule these via Domino Program Documents
Thoughts for a future iteration
Making the trigger URLs accessible to an anonymous user by somehow using sessionAsSigner
This was a new one to me and IBM support did a great job of helping us to find the issue.
We modified a customer's mail template to add some core business functionality that processed the inbound emails before they appeared in the inbox and provided some quick filing suggestions based on some business rules.
In order to present the options in the UI we had to add information to the new emails – this is where we got caught out.
If you use Document.save as part of a “Before New Mail Arrives” agent, then the inbox will not refresh automatically – or the refresh will be delayed. On our customer's server the refresh failed in 100% of cases, whereas on my dev server it was delayed 60% of the time and failed 40% of the time. For IBM support it never failed or was delayed.
When you look at the Help document for the DocumentContext property, it states:
For an agent activated “Before New Mail Arrives,” the in-memory document is the email that is about to be delivered. Because the agent is activated instantly for each email as it’s about to be saved into the mail database, each time the agent runs you are working with a single unsaved document. The UnprocessedDocuments property in AgentContext will not return any documents for this agent type.
So if you modify the document you do not need to save it – it is about to be saved anyway.
So we removed our Document.save calls whenever Session.IsOnServer is True, and the inbox refresh now works.
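A minimal LotusScript sketch of the resulting agent logic – the item name and value are purely illustrative, and the structure is an assumption, not our customer's actual code:

```lotusscript
' Sketch of a "Before New Mail Arrives" agent that annotates the message.
' DocumentContext is the in-memory email about to be delivered.
Dim session As New NotesSession
Dim doc As NotesDocument
Set doc = session.DocumentContext
Call doc.ReplaceItemValue("FilingSuggestion", "Invoices") ' hypothetical item
' On the server the router saves the document after the agent completes,
' so an explicit save would interfere with the inbox refresh.
If Not session.IsOnServer Then
	Call doc.Save(True, False) ' only when running locally for testing
End If
```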
Many thanks to Djesus Rex C. Cada from IBM support, who went through the past tickets and found the workaround for us.
Download this case study as a PDF
1. The Challenge
PX Ltd manage the operations and maintenance of the Fellside 188 MW CHP plant, providing critical process steam and electricity to the adjacent Sellafield nuclear reprocessing facility.
PX Ltd approached FoCul for a browser-based solution to manage engineering change at Fellside. The system needed to provide a controlled and auditable process for managing engineering change and to allow changes to be managed efficiently within the organisation.
The solution needed to be flexible so that lower risk changes were handled differently from higher risk changes. They also wanted a configurable solution so that workflow and risk assessments could be modified in the future.
Functionality also needed to include managing temporary modifications requiring revalidation every 3 months.
You can download this as a PDF.
1. The Challenge
ABB asked FoCul to help make their proprietary pRIME© (Process Reliability and Integrity Management Excellence) consulting process more collaborative, consistent and efficient using a software solution.
The pRIME© process requires close collaboration across multiple teams and it was difficult to do this well and consistently using spreadsheets. The volume of information collected in the studies and the detail of the subsequent reports also meant that the reporting phase of the project was very time consuming.
The solution needed to be used by ABB clients working on their own or alongside the ABB experts. It needed to manage, maintain and present large amounts of confidential data from sites globally and to present this information in a structured way to allow end users to make effective decisions.