We use XML feeds from our Domino applications to pull data into Excel. Users can right-click on the data tables and they get refreshed from Domino.
The functionality generally works well, but if you need to add a new "column" to your XML feed there is no way within Excel to pull it into the data table. You can either start again and pull in a new table, or edit the XML files that make up the Excel file itself. If you already have lots of charts set up then starting again is not an option.
I needed to add a new column today so I thought that I would document what I had to do.
1) save the xlsm file as *.xlsm
2) change the file name to *.zip
3) unzip the contents
4) edit the file ".\xl\xmlMaps.xml" to add the new element to the embedded map schema
5) rezip the file
6) change the file type back to *.xlsm
7) open the spreadsheet
8) use the developer ribbon bar to open the XML source panel
9) drag the new column onto the worksheet
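The rename/unzip/edit/rezip round-trip in the steps above works because an .xlsm file is just a ZIP archive. A minimal Python sketch of the same idea is below; the file names, the stand-in archive contents, and the "newcolumn" element are all hypothetical placeholders, not the actual schema from our feed.

```python
import zipfile

# Build a minimal stand-in archive so the sketch is runnable on its own;
# in practice 'src' would be your real .xlsm workbook (renamed or not --
# zipfile does not care about the extension).
src, dst = "report_before.zip", "report_after.zip"
with zipfile.ZipFile(src, "w") as z:
    z.writestr("xl/xmlMaps.xml", "<MapInfo><Schema>...</Schema></MapInfo>")
    z.writestr("xl/workbook.xml", "<workbook/>")

# Copy every entry across unchanged, rewriting only xl/xmlMaps.xml.
with zipfile.ZipFile(src) as zin, \
     zipfile.ZipFile(dst, "w", zipfile.ZIP_DEFLATED) as zout:
    for item in zin.infolist():
        data = zin.read(item.filename)
        if item.filename == "xl/xmlMaps.xml":
            # Hypothetical edit: append the new column's element to the schema.
            data = data.replace(
                b"</Schema>",
                b'<xsd:element name="newcolumn"/></Schema>')
        zout.writestr(item, data)

with zipfile.ZipFile(dst) as z:
    print(z.read("xl/xmlMaps.xml").decode())
```

Rename the output back to *.xlsm and the new element shows up in the XML Source panel, ready to drag onto the worksheet.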
Just a quick heads-up about a known issue that still catches me out from time to time. It may even be fixed in the latest versions, but I can't find any IBM technotes on it.
If you have a scheduled agent that is set to run "More than once a day" and "All Day", then when you rebuild it from source control the "All Day" property changes to between 00:00 and 00:00, which effectively means never.
The solution is to set the “All Day” agents to 00:01 – 23:59 and source control will preserve these values.
I wanted to move to a text-based changelog in our applications rather than a Notes form or even an XPage. The idea was that it would make merge conflicts in source control easier to deal with.
I eventually settled on using Markdown as I couldn't get the formatting that I wanted with plain ASCII.
We used scheduled code in our XPage apps to do things like pre-building dashboards and storing the data in the application scope.
There is no simple way to schedule Java code that has been written for use with XPages. The best solution I have found has been to use “XAgents” and to poll the corresponding URLs somehow.
I have tried this using various methods, but it has not been straightforward due to authentication issues and having to deal with SSL certificates.
The best solution I have come up with to date is to use the Linux cron daemon and the wget command, which is typically used to download data over the web. This can be configured using the crontab -e command-line utility, or more easily still by using Webmin.
Step 1 – create an internet site mapped to localhost. On this site disable session authentication as it seems to cause spurious issues with automated remote calls. Also disable any settings that force traffic to use SSL.
Step 2 – from the command line in linux test your proposed wget command – something like :
wget --user=USERNAME --password=XXXX "http://localhost/apps/aw.nsf/xp_f_process_bad_actors.xsp?source=Cron"
Step 3 – add this as a scheduled task using crontab -e or webmin as shown below
@hourly wget --user=USERNAME --password=XXXX "http://localhost/apps/aw.nsf/xp_f_process_bad_actors.xsp?source=Cron"
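The wget call above boils down to an HTTP GET with Basic credentials, so any scripting language on the server could do the polling instead. A rough Python equivalent is sketched below; the URL path, USERNAME and XXXX are the same placeholders as above, and a stand-in HTTP handler substitutes for the real XAgent so the sketch runs on its own.

```python
import base64
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in for the XAgent so the sketch is self-contained; on a real
# server you would point the URL at the xp_*.xsp page on localhost.
class FakeXAgent(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"OK")

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), FakeXAgent)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = (f"http://127.0.0.1:{server.server_port}"
       "/apps/aw.nsf/xp_f_process_bad_actors.xsp?source=Cron")

# Equivalent of wget --user/--password: send HTTP Basic credentials up front.
req = urllib.request.Request(url)
creds = base64.b64encode(b"USERNAME:XXXX").decode()
req.add_header("Authorization", f"Basic {creds}")

with urllib.request.urlopen(req, timeout=10) as resp:
    status = resp.status
    body = resp.read().decode()
print(status, body)

server.shutdown()
```

Note that this shares the plain-text credential problem mentioned below: anything the script can read, so can anyone with access to the file.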
When you run test downloads from Webmin, the downloaded files are stored in /usr/libexec/webmin/cron/.. although if you schedule the task as a user such as apache (as shown above) then no file is actually saved, because of a permissions failure.
The cron log is at /var/log/cron
Your Domino username and password will be stored in plain text in the crontab files and in the logs
You could also use bash script files and schedule these via Domino Program Documents
Thoughts for a future iteration
Making the trigger URLs accessible via an anonymous user, somehow using sessionAsSigner
This was a new one to me and IBM support did a great job of helping us to find the issue.
We modified a customer's mail template to add some core business functionality that processed inbound emails before they appeared in the inbox and provided some quick filing suggestions based on business rules.
In order to present the options in the UI we had to add information to the new emails. This is where we got caught out.
If you use document.save as part of the "Before New Mail Arrives" agent then the inbox will not refresh automatically, or the refresh will be delayed. On our customer's server the refresh failed in 100% of cases, whereas on my dev server it was delayed 60% of the time and failed 40% of the time. For IBM support it was never delayed or failed.
When you look at the help document for the DocumentContext property, it states:
For an agent activated “Before New Mail Arrives,” the in-memory document is the email that is about to be delivered. Because the agent is activated instantly for each email as it’s about to be saved into the mail database, each time the agent runs you are working with a single unsaved document. The UnprocessedDocuments property in AgentContext will not return any documents for this agent type.
So if you modify it you do not need to save it – because it is about to get saved.
So we removed our document.save calls when session.isOnServer is true, and the inbox refresh now works.
Many thanks to Djesus Rex C. Cada from IBM support, who went through the past tickets and found the workaround for us.