Show me the money… er… the barcode?

Fixing the issue of checked-out items' barcodes not showing up for patrons logged into the Koha OPAC.

The Problem

Earlier this morning, Pranab Roy, who manages the library at Karnavati University's UnitedWorld School of Business, Kolkata campus, popped up on WhatsApp with a question – “Sir, why is Koha not showing the barcode / accession number of a borrowed document when a user logs into their own account via the OPAC?”

This is what he meant. And this is actually the expected behaviour, with Koha doing exactly what it was asked to do. I could understand his confusion: the OPAC search's details view showed the barcode quite nicely, so why not for the users themselves?

A bit of backstory

Karnavati University Libraries had shifted to L2C2 Technologies' cloud platform from a pre-existing Koha instance maintained by a third party. As such, the system had quite a few issues. While we had fixed a large number of these during the initial on-boarding stage, some are getting ironed out only now, as the librarians hit these “bumps on the road”. Pranab's problem was one such.

The Solution

The option to display the barcodes of a logged-in OPAC user's checkouts (issues) is driven by the SHOW_BCODE patron attribute. In Koha, patron attributes, or more correctly ExtendedPatronAttributes, are library-defined custom fields that can be applied to patron records, e.g. voter / Aadhaar card number, registration number etc.

SHOW_BCODE is a boolean attribute that is defined as either Yes or No, and it is usually loaded into Koha during the web-installer phase of Koha's installation from the optional SQL dump file /usr/share/koha/intranet/cgi-bin/installer/data/mysql/en/optional/patron_atributes.sql in Debian package based installations. The tiny file contains a single SQL INSERT statement:

INSERT INTO `borrower_attribute_types` (`code`, `description`, `repeatable`, `unique_id`, `opac_display`, `staff_searchable`, `authorised_value_category`) VALUES ('SHOW_BCODE', 'Show barcode on the summary screen items listings', 0, 0, 1, 0, 'YES_NO');

In the case of Karnavati University, this optional patron attribute was not imported during the *original* installation done by the third-party support provider at the time. And without SHOW_BCODE being defined, Koha had no way of displaying the barcodes of checked-out books to patrons logged in via the OPAC.
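
If you wish to check whether your own instance has the attribute defined, a quick query against the Koha database will tell you. This is a minimal sketch; the table and column names are the same ones used by the INSERT statement above:

-- returns no rows if the optional attribute type was never imported
SELECT code, description, opac_display
  FROM borrower_attribute_types
 WHERE code = 'SHOW_BCODE';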

In the end, the following two lines, followed by enabling the ExtendedPatronAttributes system preference, cleared off the issue for Karnavati:
$ cd /usr/share/koha/intranet/cgi-bin/installer/data/mysql/en/optional
$ mysql -uroot -p koha_karnavati < patron_atributes.sql

A memcached restart later (better safe than sorry), the OPAC started showing the barcodes to logged-in patrons.

Getting LE certbot-auto to work on an aging Debian 7.x

Stuck with Debian 7 in 2020 and need certbot-auto to work? Here’s how we did it.

Yes, it is 2020 and it is very late in the day to still be using Debian 7.x. But you may just have a piece of critical infrastructure that is still running on that Debian 7 box, and moving it may not be an immediate possibility. Your infrastructure component also happens to use Let's Encrypt certificates for SSL. Your certificate has just expired and you ran certbot to issue a new certificate. And BAMMMM! you hit this:

Replacing certbot-auto…
Creating virtual environment…
Installing Python packages…
/opt/eff.org/certbot/venv/bin/python: No module named pip.__main__; 'pip' is a package and cannot be directly executed
Traceback (most recent call last):
  File "/tmp/tmp.BLzjDMi7yW/pipstrap.py", line 177, in <module>
    sys.exit(main())
  File "/tmp/tmp.BLzjDMi7yW/pipstrap.py", line 149, in main
    pip_version = StrictVersion(check_output([python, '-m', 'pip', '--version'])
  File "/usr/lib/python2.7/subprocess.py", line 544, in check_output
    raise CalledProcessError(retcode, cmd, output=output)
subprocess.CalledProcessError: Command '['/opt/eff.org/certbot/venv/bin/python', '-m', 'pip', '--version']' returned non-zero exit status 1

Well, we did. The problem stems from the fact that, from version 0.32 onwards, certbot-auto stopped working with EOLed Linux distributions. This hit distros like Debian 7.x, which was EOLed towards the end of 2018 and for which official certbot support was also dropped. Debian 7.x (wheezy) ships with an ancient version of pip that cannot be run as a module (python -m pip). And hence the mess.

This is how we got over it for the moment (it is Jan 2020 at the time of writing):

1. rm -rf /opt/eff.org

2. Download the old 0.31 version of certbot-auto so that we can get around the version issue:
wget https://raw.githubusercontent.com/certbot/certbot/75499277be6699fd5a9b884837546391950a3ec9/certbot-auto

3. chmod +x ./certbot-auto

4. Run certbot-auto with the necessary switch: ./certbot-auto --no-self-upgrade

And this was the result:

Bootstrapping dependencies for Debian-based OSes… (you can skip this with --no-bootstrap)
Hit http://archive.debian.org wheezy Release.gpg
Hit http://archive.debian.org wheezy Release
Hit http://archive.debian.org wheezy/contrib Translation-en
Hit http://archive.debian.org wheezy/main Translation-en
Hit http://archive.debian.org wheezy/non-free Translation-en
Hit http://archive.debian.org wheezy/main amd64 Packages
Hit http://archive.debian.org wheezy/non-free amd64 Packages
Hit http://archive.debian.org wheezy/contrib amd64 Packages
Hit http://archive.debian.org wheezy/main i386 Packages
Hit http://archive.debian.org wheezy/non-free i386 Packages
Hit http://archive.debian.org wheezy/contrib i386 Packages
Reading package lists… Done
Reading package lists… Done
Building dependency tree
Reading state information… Done
gcc is already the newest version.
python is already the newest version.
python-dev is already the newest version.
python-virtualenv is already the newest version.
openssl is already the newest version.
libffi-dev is already the newest version.
libaugeas0 is already the newest version.
libssl-dev is already the newest version.
ca-certificates is already the newest version.
augeas-lenses is already the newest version.
The following packages were automatically installed and are no longer required:
libapache2-mod-fcgid libcarp-assert-more-perl libcarp-assert-perl libcgi-compile-perl libcgi-emulate-psgi-perl libdevel-stacktrace-ashtml-perl
libfcgi-procmanager-perl libfile-pushd-perl libfilesys-notify-simple-perl libfreeradius-client2 libhash-multivalue-perl libhtml-lint-perl libhttp-body-perl
libmodule-refresh-perl libplack-perl libtest-longstring-perl libtest-requires-perl libtest-sharedfork-perl libtest-tcp-perl rt4-apache2 rt4-clients rt4-db-sqlite
Use 'apt-get autoremove' to remove them.
0 upgraded, 0 newly installed, 0 to remove and 155 not upgraded.
Creating virtual environment…
Installing Python packages…
Installation succeeded.
Saving debug log to /var/log/letsencrypt/letsencrypt.log
Plugins selected: Authenticator apache, Installer apache

Which names would you like to activate HTTPS for?
- - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Shared in the hope that it may help someone else in a similar position.

Zara hatke, zara bachke… yeh hai DataTables Meri Jaan!

A short tutorial on identifying and fixing DataTables errors arising from missing data in Koha ILS

The Problem

Recently we fielded a support call from Parama (Sarkhel)-di, Librarian at Ramakrishna Sarada Mission Vivekananda Vidyabhavan. Her complaint: for a particular faculty member she was not able to see the member's checkouts; instead the screen showed “Loading” and then nothing happened, even though the system showed there were 9 items checked out to her.

RKSMVV being cloud hosted, we simply punched in the specific member's cardnumber and clicked on the “Show checkouts” button. And voila! the error was right in front of us. Experience told us that it looked like a typical DataTables error.

What is DataTables?

DataTables is a jQuery plugin for displaying information in HTML tables and adding interactions to them. It provides searching, sorting and pagination without any configuration. Given that Koha makes good use of the plugin, if you wish to learn more about it, please visit the DataTables examples index for a quick start.

Debugging the error

Since we suspected a DataTables error, our first step was to check our browser's JavaScript console. And sure enough, there was an error which had been triggered when a NULL value was passed to the escapeHtml() function at line number 285 inside Koha's checkouts.js JavaScript library.

We still needed to know *what* exactly was passed to the escapeHtml() function. For that we clicked on the link to the right, which pointed to line number 285 inside the checkouts.js file. As the debugger's Sources tab opened the file around line number 284, the exact error became immediately clear: one of the checked-out items did not have a barcode assigned.

Now it was just a matter of finding out *which* of the 9 items checked out to the member did not have a barcode. First, we ran a SQL query on the instance's issues table with the member's borrowernumber to retrieve the itemnumbers of the checked-out items, and then, using this list of itemnumbers, we queried the items table to find out which of the 9 items had a missing barcode. The result was self-explanatory.
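
For reference, the two lookups can be sketched roughly as below. This is a minimal sketch against the stock Koha schema; the borrowernumber 1234 is just a hypothetical placeholder:

-- itemnumbers of everything currently checked out to the member
SELECT itemnumber FROM issues WHERE borrowernumber = 1234;

-- of those, which items have no barcode assigned?
SELECT itemnumber, biblionumber, barcode
  FROM items
 WHERE itemnumber IN (SELECT itemnumber FROM issues WHERE borrowernumber = 1234)
   AND (barcode IS NULL OR barcode = '');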

By cross-referencing the biblionumber attached to the itemnumber, we opened the offending item's holdings record in edit mode from the staff client and, for the time being, put in “FIX_BARCODE” as a temporary placeholder barcode. Immediately, the member's account showed the table of checked-out documents correctly. The member was requested to temporarily return the book so that the barcode could be fixed.

But why did it happen?

The book was lent out to the member several years back, from what is now a very, very ancient version of Koha. At that time the DataTables plugin was not the norm. About a year back the installation was moved to the latest version of Koha and the database updated. The error was triggered only now because this was the first time since then that this specific member had come back to borrow a book. Had she tried to issue or return a book earlier, the error would have been caught much sooner.

Once we had fixed the error, we also checked the entire database for any other such cases. And sure enough, there were 3 more books, issued ages back to 2 other faculty members, which too did not have any barcode assigned. Parama-di noted the numbers down so that these books could be recalled and their barcodes updated.
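
That database-wide sweep boils down to a single query along these lines (again a sketch, under the same stock-schema assumptions as above):

-- all current checkouts whose items lack a barcode
SELECT s.borrowernumber, i.itemnumber, i.biblionumber
  FROM issues s
  JOIN items i ON i.itemnumber = s.itemnumber
 WHERE i.barcode IS NULL OR i.barcode = '';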

Pro-Tip to avoid such errors

If you are moving a very old version of Koha to the newest version, please run SQL queries to ensure that all the items in your items table have homebranch, holdingbranch, itype and barcode correctly assigned, rather than NULL or whitespace, before you move the updated database into production mode. A sanity check along these lines is sketched below.
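
A minimal sketch against the stock Koha items table; TRIM() is used so that whitespace-only values are caught as well:

-- items with missing or blank critical fields
SELECT itemnumber, biblionumber
  FROM items
 WHERE homebranch IS NULL OR TRIM(homebranch) = ''
    OR holdingbranch IS NULL OR TRIM(holdingbranch) = ''
    OR itype IS NULL OR TRIM(itype) = ''
    OR barcode IS NULL OR TRIM(barcode) = '';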

Gotcha! GoogleIndicTransliteration may not work if your Internet connection is slow!

Cross domain scripts may fail to load over slow links, reducing functionality and impacting user experience.

Indian libraries on Koha ILS often use the GoogleIndicTransliteration setting to provide Indian-language search support in the OPAC. The feature works by having your Koha send a request to Google's servers to dynamically load the transliteration API code into the OPAC page and execute it, so that it is ready to transliterate what you type into the language you have selected. So basically, you must have an active Internet connection on the system from which you are using the OPAC if you want transliteration to work. However, as it turns out, that may not exactly be enough!

Yesterday, Sujan Saha messaged me asking if the GoogleIndicTransliteration setting had somehow been missed out during the setup of a particular OPAC instance. The screenshot collage above shows what he expected (on the left) versus what he got (on the right). Nope! the setting was very much there, so what was the issue?

Turns out he was trying to access the OPAC over a particularly slow (that day) Vodafone 3G connection. So, while the rest of Koha would load and execute even over the slow link, Google transliteration didn't; in fact, it vanished from his screen. Since it is JavaScript, the best way to debug it is to turn to your browser's console.

NOTE: If you are on Google Chrome, you can find it under the More tools -> Developer tools option in the menu.

In this case, the console clearly shows the following warnings:

[Screenshot: browser console warnings showing the cross-origin Transliteration API scripts failing to load]

So basically, a slow network link combined with a couple of cross-origin, parser-blocking scripts (the Transliteration API code) was the reason why he saw no option for transliteration.

Quod erat demonstrandum!

MarcEdit QuickTip #2 – Unicode in your source file

Converting a batch of multi-lingual bibliographic records stored in an MS-Excel worksheet to .mrc using MarcEdit? Be sure to check that your charset is set to UTF-8 while saving the spreadsheet or its CSV export.

Naveed Bhatti, a fellow Koha ILS user from neighbouring Pakistan, pinged me last week over a problem he was facing. He had multi-lingual bibliographic data stored in an MS Excel worksheet. He wanted to use MarcEdit 6's Delimited Text Translator tool, available under the “Add-ins” menu, to convert this file into a Unicode (UTF-8) encoded MARC21 (.mrc) file so that he could import the records into Koha.

However, when he attempted to generate the .mrk file (MarcEdit's intermediate MarcBreaker mnemonic format, before export to .mrc), instead of seeing the Arabic script he saw a bunch of “?????? ????? ???” wherever there was Arabic-script text in the spreadsheet. Naveed thought he must be missing something small but crucial, perhaps a setting.

[Screenshot: MarcEdit's .mrk output showing “?????? ????? ???” in place of the Arabic script]

I had a hunch, but I wanted to check the data before commenting. So I asked for a few sample records, which he sent over the next day. I checked and found it was a simple case of a charset conversion glitch at the spreadsheet end of things. I could easily generate both the incorrect as well as the correct output (see above) with a simple change of the charset filter. I was using LibreOffice Calc on Windows 8.1, and the default export charset was *not* set to Unicode (UTF-8). With the default export charset, the exported / saved file did not contain the correct Unicode codepoints for the non-Latin data. As a result, at MarcEdit's end it became a simple case of garbage-in-garbage-out instead of receiving the correct non-Latin data.

The screenshot below shows the correct filter to use if you are using LibreOffice Calc. If you are using MS-Office, you should see something similar.

[Screenshot: LibreOffice Calc's export dialog with the character set filter set to Unicode (UTF-8)]