Display the totals for fines and fees, payments made, and the outstanding, written-off and forgiven amounts within a specified date range

A nifty single line SQL report for accounting using Koha.

Earlier this week, Sri Kalipada Jana, librarian at our client-partner Basanti Devi College, Kolkata, filed a new custom SQL report request. He wanted a report that did the following:

To list the total fines accrued, total paid so far and the total outstanding fines / fee between 02 (two) given dates.

Now, readers of this blog who are acquainted with the ready-made, user-contributed SQL report library on the Koha Community wiki will know that there are quite a few reports available that generate output similar to Kalipada’s requirement. However, these ready-made SQL reports usually generate this data one data point at a time, e.g. one report will provide total fines, another will provide total outstanding fines, and so on. Further (and perhaps left as an exercise to the reader), these ready-made reports usually do not take into account “reversed charges“, “partial payments” entered as credits, etc.

The report

The report presents a consolidated, single line of the total fines & fees (F + FU + N + A + M + L), amount outstanding (F + FU + N + A + M + L), paid (PAY + C), written off (W) and forgiven (FOR) within a date range. If you wish to learn more about the mnemonics used within the brackets, you should look at the “Hard Coded Values” entry on the Koha Wiki.
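
If you would like to confirm which of these account type codes actually occur in your own installation before running the report, a quick aggregate query against the accountlines table will list them. This is only an illustrative helper and not part of the report itself:

-- Illustrative helper: list the account type codes present in accountlines,
-- together with how many rows carry each code.
SELECT accounttype, COUNT(*) AS occurrences
FROM accountlines
GROUP BY accounttype
ORDER BY occurrences DESC;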

By itself, the report is simple: it aggregates the totals produced by 05 (five) SQL sub-queries and presents them together. One interesting thing to note in the report is the use of “runtime variables“. The two run-time variables @FromDate and @ToDate hold the user-specified start and end dates, instead of asking the user for them repeatedly in each sub-query.

SELECT * FROM 

(SELECT (@FromDate:=<<From date|date>>) AS 'From (y-m-d)', (@ToDate:=<<To date|date>>) AS 'To (y-m-d)') AS T1, 

(SELECT
    IFNULL(ROUND(SUM(accountlines.amount), 2), "0.00") AS 'Total Fines/Fees' FROM accountlines 
WHERE 
    accounttype IN ('F', 'FU', 'N', 'A', 'M', 'L') AND  DATE(timestamp) BETWEEN @FromDate AND @ToDate) AS T2, 

(SELECT 
    IFNULL(ROUND(SUM(accountlines.amountoutstanding), 2), "0.00") AS 'Total O/S' FROM accountlines 
WHERE 
    accounttype IN ('F', 'FU', 'N', 'A', 'M', 'L') AND  DATE(timestamp) BETWEEN @FromDate AND @ToDate) AS T3, 

(SELECT
    IFNULL(REPLACE(ROUND(SUM(amount),2),"-",""), "0.00") AS 'Paid / Credited'  FROM accountlines 
WHERE
   accounttype IN ('PAY', 'C') AND description NOT LIKE "%Reversed%" AND DATE(timestamp) BETWEEN @FromDate AND @ToDate) AS T4, 

(SELECT
    IFNULL(REPLACE(ROUND(SUM(amount),2),"-",""),"0.00") AS 'Written off'  FROM accountlines 
WHERE 
    accounttype='W' AND DATE(timestamp) BETWEEN @FromDate AND @ToDate) AS T5, 

(SELECT
    IFNULL(REPLACE(ROUND(SUM(amount),2), "-", ""), "0.00") AS 'Forgiven'  FROM accountlines 
WHERE 
    accounttype='FOR' AND DATE(timestamp) BETWEEN @FromDate AND @ToDate) AS T6

Cheers!

L2C2 Technologies extends technological support to KLA for open access journal

L2C2 Technologies to provide OJS software hosting and maintenance along with publishing support to Kerala Library Association for KJIST for the next 5 years.

Earlier today, the first open access (OA) LIS journal from Kerala, India – “The Kerala Library Association Journal of Information Science & Technology” (KJIST) – was launched with the release of its inaugural issue, Vol. 1 No. 1 (July 2018), by Dr. Rajan Gurukkal (Vice Chairman, Kerala State Higher Education Council) at the DLIS Hall, Kerala University Library Building, Thiruvananthapuram.

As KJIST is the first open access LIS journal being published from Kerala, all articles published in it will be available online at https://www.kjist.in for reading and download. As the technology support partner, L2C2 Technologies is providing the journal publishing software hosting and maintenance, along with publishing support, to the Kerala Library Association for KJIST, initially for a period of 5 years. The journal is being hosted using the latest stable version of the Open Journal Systems (OJS) journal publishing platform.

While readers are not required to register at the site to read the articles, registered users will receive automatic updates about articles and issues as and when they are published. The privacy policy is GDPR compliant, and all personal data submitted by readers, authors, reviewers and editors is maintained on servers geographically located in India. Mandatory SSL has been implemented for the site, so that when users interact with the site, e.g. when they log in or read or download articles, all their transactions remain securely encrypted end-to-end at all times. L2C2 Technologies will also be providing KLA with in-depth usage statistics and data analytics for the online journal site.

About KJIST

KLA Journal of Information Science and Technology (KJIST) is published twice every year (January and July) in open access (OA) and print formats. It will include original papers, short communications, book reviews and letters pertaining to library science, information science and related fields. The papers included will focus on aspects of exploring, applying, and evaluating new theories and ideas to develop and modernize libraries and enhance library services. It is meant for library professionals, documentation and information professionals, researchers, students and others interested in the library and information science field.

About Kerala Library Association

Formed in 1972 by a group of librarians who strongly felt the need for an independent professional association at the state level – one entirely dedicated to libraries and librarians – to promote the cause of library development and the professional standards of librarians, the Kerala Library Association (KLA) is headquartered at Thiruvananthapuram, the capital of the state of Kerala, India. The association has boldly voiced the sentiments and views of professionals to the concerned authorities on several occasions. It has successfully projected the need for improving library services in colleges and universities, and has helped the Government formulate policies relating to library development in the state.

Chinmaya University partners with L2C2 Technologies for Koha support

We are pleased to extend a warm welcome to the newest member of our growing client-partner family in Southern India – Chinmaya Viswavidyapeeth – a Deemed to be University under UGC, with campuses in Kochi and Pune. The CVV library, which is self-hosted, has gone live on Koha ILS 18.05.04, with server management and Koha maintenance being handled by L2C2 Technologies.

About Chinmaya Viswavidyapeeth

Inaugurated on 9th July 2017 at Kochi as the Chinmaya Vishwavidyapeeth University for Sanskrit and Indic Traditions, this de novo deemed university under the UGC presently operates from two campuses – the Chinmaya Eswar Gurukula campus at Peppathy in Ernakulam and the Chinmaya Naada Bindu Gurukula campus at Kolwan in Pune. It offers courses in a number of traditional and contemporary streams.

The university’s motto can be summed up as “विद्यया रक्षिता संस्कृतिः सर्वदा । संस्कृतेर्मानवाः संस्कृता भूरिदा:” (translated: “Knowledge protects culture forever; cultured people share abundantly”) in the words of Swami Tejomayananda, founder, Chinmaya Viswavidyapeeth.

Surendranath Law College partners with L2C2 Technologies to automate their library

136-year-old Surendranath Law College selects L2C2 Technologies’ Koha support.

We are pleased to extend a warm welcome to the newest member of our growing client-partner family in Eastern India – Surendranath Law College, Kolkata. At 136 years, SNLC is regarded as one of the oldest law colleges in the country. The SNLC library has gone live on the Koha ILS 17.11 series on L2C2 Technologies’ cloud hosting platform.

About Surendranath Law College

Surendranath Law College (formerly known as Ripon College) is an undergraduate law college affiliated with the University of Calcutta in Kolkata, India. It was established in 1885 by a trust formed by the nationalist leader, scholar and educationist Surendranath Banerjee, a year after he founded Surendranath College. It is now regarded as one of the oldest law colleges in the country. Dr. Rajendra Prasad, the first President of the Republic of India, is among the college’s several notable alumni.

Source: Wikipedia

AskALibrarian – Koha’s new feedback form plugin

A Koha plugin that adds user feedback functionality to Koha’s OPAC.

What is the ‘AskALibrarian’ Koha plugin?

It is a Koha plugin that adds a modal user feedback form to the Koha OPAC. It is written to utilize the Koha Plugin System that was originally introduced into #kohails with version 3.12. This means that while the feature is an extension to a stock Koha installation, it is essentially a Koha-native solution, and thus far better integrated with Koha than any third-party alternative. In terms of direct advantages, it brings in (a) ease of maintainability across Koha upgrades, (b) the ability to use the submitted data directly from Koha’s SQL reports, and (c) data privacy – the data is stored directly inside your Koha database rather than with a third party (something that assumes far greater significance in the GDPR era).

Why this plugin?

The Koha OPAC provides a lot of functionality to both library staff and library patrons. Below is a short list of things you can do (there are more) and things you cannot do.

Things you can 👍

  • Log in with your user account
  • Reserve a book
  • Renew a book
  • Pay a fine or fee
  • Update your contact information
  • Do a self-registration
  • Submit a purchase suggestion
  • Add star ratings
  • And much more

Things you can’t 👎

  • Leave feedback for the library staff

Since a lot of L2C2 Technologies’ client-partners had been asking for a user feedback form facility, we finally decided “Why not… let’s do a plugin!” And so the AskALibrarian plugin was born.

How does it work?

The plugin adds a bootstrap modal feedback form as a menu option on the navbar at the top of the Koha OPAC. On submission of the form by the user / visitor, the information is sent via an AJAX call to a Perl script – askalibrarian.pl – on the staff client side. The script invokes the required sub-routines from the plugin to do two things: (a) send the user an acknowledgement of the submitted input to the user’s email address, if the address provided is reachable, and (b) store the submitted data in the koha_plugin_com_l2c2technologies_askalibrarian_feedback table in the Koha database. A callback then displays an alert popup on the OPAC, informing the user whether the data was captured successfully.
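
Because the submissions end up in an ordinary table inside the Koha database, they can be pulled out with a regular Koha SQL report. The query below is only a minimal sketch – it simply dumps everything in the plugin’s table; adapt the column selection and filters after checking the table structure the plugin actually creates:

-- Minimal sketch: dump all feedback rows captured by the AskALibrarian plugin.
-- Adjust the selected columns / WHERE clause to match the table definition
-- created by the plugin's install routine.
SELECT *
FROM koha_plugin_com_l2c2technologies_askalibrarian_feedback;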

How to get the plugin?

The plugin is published under the GNU GPL v3.0 license and is available here from L2C2 Technologies’ GitHub repository. Pull requests for bugs or improvements are welcome 🙂

Please show me step by step!

Given how our readers from the Indian sub-continent are fond of “step-by-step” instructions, we have put together a small video (about 11 mins) that shows how to set up and use the plugin.

Enjoy! May the source be with you!

Koha Patron Image Packer helper script

A one-click utility to help build the bulk patron image loader zip archive, for users accessing Koha ILS from their Windows PCs.

Academic libraries, like the ones in colleges, typically need to bulk upload patron images (of students) at the start of every school year. The photos are usually made available by the admissions office, named and numbered as per either the unique enrollment id or the student id of each student. The college libraries typically also use these same numbers as their library card numbers.

Of course, when several hundred patron images have to be imported, Koha’s bulk patron image upload option comes in very handy. It is available at More > Tools > Patrons and Circulation > Upload Patron Images. We can create a ZIP archive with all these uniquely named and numbered photos along with a manifest file named either DATALINK.txt or IDLINK.txt. The structure of the manifest file is simple – <cardnumber><separator><filename>. The <separator> can be a comma or a tab character.
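
For instance, a batch of three students whose image files are named after their card numbers might be packed with an IDLINK.txt that looks like this (the card numbers and file names below are made up purely for illustration):

2018-0001,2018-0001.jpg
2018-0002,2018-0002.jpg
2018-0003,2018-0003.jpg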

The only kink in this workflow is that quite often we would see people creating the manifest by hand. This, as expected, frequently led to errors, and as a result they often had to make multiple attempts at uploading the photos while wasting precious time identifying the errors in the manifest.

So when our client-partner Basanti Devi College Library wanted to upload several hundred patron images at one go, we decided to put together a small PowerShell script that could automate this function so that it could get done quickly, without any error and at the click of the mouse.

The Koha Patron Image Packer is the outcome. It is a small PowerShell script that allows users working on a Windows-based system to build their bulk patron image batch loader zip file, along with its IDLINK.txt manifest, with just a single click. Its minimum requirements are PowerShell v3.0 and .NET Framework 4.0.

Get the code

The code is available from https://gitlab.com/indradg/koha-patron-image-packer

Installation

Installation is nothing more than copying the datalinker.ps1 file over to the folder holding your patron images, which should be named as per their card numbers in Koha.

However, before your Windows system will allow you to run a PowerShell script, it may require you to set your ExecutionPolicy to RemoteSigned for your CurrentUser. To do that, go to Run, type in PowerShell and hit Enter to open the Windows PowerShell interactive console, then type in the following command:

Set-ExecutionPolicy -Scope CurrentUser -ExecutionPolicy RemoteSigned

To check whether this has been set correctly, you can run the following command and check its output:

Get-ExecutionPolicy -List

Demonstration

A screencast of this feature at work is available on Youtube. It is a short video under 2 minutes. Watch it at 720p in full-screen if your bandwidth supports it.

License

It is released under GNU GPL v3 or later, the same license as Koha.

Acknowledgment

A.J. Tomson, Librarian, Devagiri College, Kozhikode, who wrote an early analogous script in Python back in the Koha 3.12 days.

Vimal Kumar V. whose blog post “A useful script to compress patron images to upload” had originally put the spotlight on Mr. Tomson’s work.

RTFM Series : Memcached and “DBI Connection failed: Access denied for user [..]”

Stumped by Koha v18.05’s refusal to access the database after a koha-remove followed by a koha-create using the same instance name? Then read on!

This post applies strictly to Koha 18.05, which was released on May 24, 2018. The new version with its new features has created a lot of excitement among users. However, the version has some major changes, and unless you **CLOSELY** read and understand the release notes you will be asking for trouble.

In this post we are going to talk about Memcached, which is turned on by default from 18.05. If you overlook this fact, you may find yourself wasting hours trying to troubleshoot a problem which may seem to inexplicably (not quite, even though it looks that way) go away after restarting your system.

Last weekend my young friend Jayanta Nayek spent nearly a day trying to understand why he was getting the error – “DBI Connection failed: Access denied for user” – whenever he tried to access the web-installer part of the staff client on his new 18.05 instance. Since he was on a test system, he had followed the old rinse-and-repeat routine. So when his installation did not work out the first time, he ran sudo koha-remove library and then re-ran sudo koha-create --create-db library to start afresh.

When he re-created the instance, i.e. “library”, he was stumped by the following error:

Software error:

DBIx::Class::Storage::DBI::catch {...} (): DBI Connection failed: Access denied for user 'koha_library'@'localhost' (using password: YES) at /usr/share/perl5/DBIx/Class/Storage/DBI.pm line 1490. at /usr/share/koha/lib/Koha/Database.pm line 103

Of course, when he tried to access the koha_library database from the mysql command-line client using the user and pass from his /etc/koha/sites/library/koha-conf.xml, it worked perfectly. But if he came back to the browser and tried to access the web-installer, the error would return.

So what was happening here? In one word – memcached! Memcached is an open source, distributed memory object caching system that alleviates database load to speed up dynamic Web applications. The system caches data and objects in memory to minimize the frequency with which an external database or API (application program interface) must be accessed. (Source: What is Memcached?).

In simple terms, what this means for Koha is that memcached caches the frequent database queries fired off by Koha. If an SQL result set has not changed since it was last queried *and* is already stored in memcached, the data is served from in-memory hashes rather than through a more time-consuming database lookup. Memcached (along with Plack) is intended to make Koha work faster under heavier loads.

When Jayanta ran sudo koha-remove library and followed it up with sudo koha-create --create-db library, the memcached server was not restarted and kept holding on to the *original* database access hashes. However, after the koha-create command re-created the instance, the database authentication credentials (the user and pass values in koha-conf.xml) had changed. That is why direct access via the mysql command-line client worked – the CLI client knows nothing about memcached – but accessing the web-installer failed, since the connection hash served up from memcached held the old, no-longer-existing credentials.

So, if any reader of this blog should find themselves facing a problem like this, simply run the command sudo service memcached restart and, once memcached has restarted, access the web-installer. It will work this time. Since memcached is an in-memory store, the restart clears the cached hashes; when the web-installer next tries to access the database, it results in what is called a “cache miss”, and the queries are run against the actual DB using the access credentials stored in koha-conf.xml.

And for goodness sake READ THE RELEASE NOTES 😉

The Mysterious Mr. Z in Z39.50 ;-)

An explainer, for all the confused souls, about what the “Z” in z39.50 stands for.

27 years back, in 1991, Subhash Ghai released a movie named – Saudagar. The signature tune from its original soundtrack was a song – “ILU ILU.. YEH ILU ILU KYA HAIN? ILU ILU? (‘ILU ILU… What is this ILU ILU?’)“.

In case you are wondering about the context of this song: ever since the NECOPAC z39.50 service went live last month, the most frequently repeated question from LIS students and professionals on FB has been – “What does ‘Z’ stand for in z39.50?”.

For some, it is a question that popped up in their head when encountering z39.50 search in Koha. For others, especially from West Bengal, India, it is apparently a question being asked at the currently ongoing interviews for WBHRB.

Some wondered whether it perhaps stood for the company that started z39.50, while others had no idea.

So what is z39.50?

In very simple terms, Z39.50 is a communications protocol for searching and retrieving information from a bibliographic information database over a TCP/IP computer network. It is covered by ANSI/NISO standard Z39.50 and ISO standard 23950:1998. The standard is maintained by the US Library of Congress.

Cataloguers mostly encounter z39.50 when they attempt to do copy cataloging. Copy cataloging is the process of fetching and editing a pre-existing bibliographic record from a z39.50 server instead of creating a completely new record from scratch. This helps save time, effort and therefore money, while bringing a certain standard to cataloging quality.

Ok! Just tell me what “Z” stands for!

Asking what “Z” represents is actually asking the wrong question. The correct question to ask is: What does Z39 stand for?

The short answer

On its own, Z39 simply refers to the American National Standards Committee Z39. By itself “Z” has no special meaning. In the present context, Z39 refers to NISO standards related to publishing, bibliographic and library applications in the United States of America, all of which start with “ANSI/NISO Z39.”.

Towards the end of this post a few example NISO standards have been listed.

The long answer

To understand this, we have to look back at the history of the standardization process as it unfolded in the United States of America during the last century.

Exactly 100 years back, in 1918, five engineering organisations and three federal organisations came together to form the American Engineering Standards Committee (AESC). In 1928, AESC re-organised to form the American Standards Association (ASA). In 1966, ASA became the USA Standards Institute, followed by a further transformation in 1969 to become the American National Standards Institute, or ANSI as we know it today.

The centenary video from ANSI describes the journey of standardization in the United States and its global impact.

It was during the ASA years that formal standardization of librarianship started to take shape.

Image source: The Legacy of a Librarian: Carolyn Ulrich’s Little Magazines

In 1935 Carolyn F. Ulrich of New York Public Library led the initiative to create a standard for arrangement of periodicals that became known as Z29.1-1935.

In 1937, prompted by various library associations, ASA appointed Ulrich to represent ASA on International Standards Association’s (ISA) Committee 46 – an international committee on documentation.

This further led to the organisation of a national committee on library standards in June of 1939. The committee was simply named “Committee Z39” and was tasked with setting up

“Standards for [library] concepts, definitions, terminology, letters and signs, practices, methods, supplies and equipment.”

Over time it came to be known as the “American National Standards Committee Z39“. In 1984, it changed its name and structure to become the National Information Standards Organization (NISO). NISO today continues to develop, maintain and publish technical standards related to publishing, bibliographic and library applications in the United States of America as an ANSI-accredited SDO (standards development organization). All NISO standards start with “ANSI/NISO Z39.” (read as zee or zed thirty nine dot).

To cut a long story short, z39.50 is simply the 50th NISO-approved standard.

Examples of NISO standards

If you wish to explore further into the world of NISO standards, please do visit the NISO standards tracker for active standards.

The featured image is from the document “Task Force on American National Standards Committee Z39: Activities and Future Directions” published in 1976. The full-text of this historical document is available here.

Generate a single sheet custom MARC21 framework in 2 minutes

For intermediate Koha ILS users who wish to quickly generate a single tab MARC framework.

Last Thursday, Ashish Kumar Barik, librarian at our new client-partner Midnapore City College, filed a support ticket asking for a custom single sheet MARC21 framework, or what is more commonly referred to by LIS professionals as a “worksheet“. He wrote that he wanted the following tags: 000, 003, 005, 008, 020, 040, 041, 044, 082, 100, 245, 250, 260, 300, 440, 490, 500, 504, 650, 700, 942, and that the sub-tags/fields should be set as in the default MARC framework shipped with Koha. We promised him his new framework. Being new to this side of Koha, he had of course missed out on two key fields without which his system would be rendered practically useless, i.e. the two local use tags – 952 and 999. Koha uses 952 to handle holdings (item) information, and 999 is purely an internal tag used to track the bibliographic records.

Now anyone who has ever set up a new MARC framework knows that it can be a laborious and time-consuming task. Further, there are chances of introducing inadvertent human errors that lead to bad data when the framework is used. As a result, at L2C2 Technologies we have developed several well-defined strategies to manage custom MARC frameworks for our clients. In today’s blog, we are going to share the simplest of the techniques we use in cases like this. The outcome of this exercise is a 100% error-free MARC framework generated in less than 2 minutes.

LEGAL DISCLAIMER: The next steps involve directly accessing and making changes in the Koha database. Use these instructions at your own risk; we are not responsible for any data loss, corruption or system errors.

The Steps

  1. We used a regex-capable editor like Notepad++ to quote the fields mentioned by Ashish, so that 000, 003, 005, 008, 020, 040, 041, 044, 082, 100, 245, 250, 260, 300, 440, 490, 500, 504, 650, 700, 942 became ‘000’, ‘003’, ‘005’, ‘008’, ‘020’, ‘040’, ‘041’, ‘044’, ‘082’, ‘100’, ‘245’, ‘250’, ‘260’, ‘300’, ‘440’, ‘490’, ‘500’, ‘504’, ‘650’, ‘700’, ‘942’. And while we did that, we also added the two fields missing from his list, i.e. ‘952’ and ‘999’.
  2. Next we defined a new framework MCC1 (MCC Framework) by visiting Home -> Administration -> MARC bibliographic framework -> New Framework
  3. Next we copied the default framework into MCC1 as its base, since that is what Ashish had wanted. At this point, the MCC1 framework is exactly the same as the default framework of Koha.
  4. Next we fired up the MySQL console, logged in with the user id and password from MCC’s koha-conf.xml, and selected Ashish’s database – in this case koha_mcc – for the next steps.
  5. Fired the following SQL query :
    UPDATE
       `marc_subfield_structure` 
    SET
       tab=0
    WHERE 
       `frameworkcode`='MCC1' 
    AND 
       `tagfield` IN ('000', '003', '005', '008', '020', '040', '041', '044', '082', '100', '245', '250', '260', '300', '440', '490', '500', '504', '650', '700', '942')
    AND
       `tab`!=0;

    MySQL client told us 152 rows were affected.

    EXPLANATION: This moved the subfields of the requested 1XX to 9XX tags into Tab 0 (the 0XX tags in the list were already on Tab 0 in the default framework, and 952 and 999 were not part of this query). The images below help illustrate the condition after this step:

  6. The next step was to set all the fields outside the list supplied by Ashish – other than 952 and 999 – to be ‘ignored’ by Koha when using the MCC1 framework. And thus the following SQL query:
    UPDATE 
        `marc_subfield_structure` 
    SET 
        `tab`='-1' 
    WHERE
        `frameworkcode`='MCC1'
    AND 
        `tagfield` NOT IN ('000', '003', '005', '008', '020', '040', '041', '044', '082', '100', '245', '250', '260', '300', '440', '490', '500', '504', '650', '700', '942', '952', '999') 
    AND
        `tab`!=0;

    This time MySQL reported that 3416 rows were updated.

  7. Our last step at the MySQL command line was the following query that removed the unwanted 0XX fields from Tab 0 :
    UPDATE 
        `marc_subfield_structure` 
    SET 
        `tab`='-1' 
    WHERE
        `frameworkcode`='MCC1'
    AND 
        `tagfield` NOT IN ('000', '003', '005', '008', '020', '040', '041', '044', '082', '100', '245', '250', '260', '300', '440', '490', '500', '504', '650', '700', '942', '952', '999');

    MySQL reported 341 rows were affected.

  8. Coming back to MCC’s Koha staff client, we did the most important thing, i.e. running the MARC bibliographic framework test. The test came out clean, without any errors. (A quick SQL sanity check that can also be run at this point is sketched just after this list.)
  9. That’s it! MCC’s custom MARC framework is ready for use. Click on the image below and then zoom in to see the details up close.
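
For those who like to double-check at the database level as well, a query along the following lines can confirm how the MCC1 framework’s subfields are now distributed across the tabs. This is only an illustrative sanity check; the exact counts will depend on the default framework shipped with your Koha version:

-- Illustrative sanity check: count the MCC1 subfields per tab after the updates.
-- Tab 0 should hold the tags requested for the single-sheet worksheet,
-- tab -1 the ignored ones, while 952 and 999 stay on their original tabs.
SELECT tab, COUNT(*) AS subfields
FROM marc_subfield_structure
WHERE frameworkcode = 'MCC1'
GROUP BY tab
ORDER BY tab;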

NECOPAC : a new z39.50 service from North Eastern India

The launch of NECOPAC, the first volunteer-run, freely available, public z39.50 service in North Eastern India.

On the auspicious occasion of Rongali Bihu (for the Assamese) and Poila Boishakh (for the Bengali), on behalf of the NECOPAC team, I’m happy to announce the start of the NECOPAC z39.50 service. This is the first freely available, public z39.50 (copy cataloging) service in North Eastern India. As on date there are 92,333 bibliographic records in the database, all of which are volunteer-contributed. We are expecting more records to be contributed soon.

Origin of the project

The germ of the idea behind NECOPAC took root about 2 years back during a chance meeting of a group of young, like-minded library professionals and technology specialists in Guwahati, Assam.

The north eastern region of India is home to 8 states – Assam, Manipur, Tripura, Meghalaya, Mizoram, Nagaland, Arunachal Pradesh and Sikkim. The region boasts of great cultural heritage and diversity. Libraries are an important means of preserving and disseminating this rich heritage, traditional knowledge, literature and other creative output.

Freely available copy cataloging of Indian publications using z39.50 service, to this day, remains largely a distant dream for most Indian library professionals. The scenario in the North East is even more difficult when it comes to publications in the local languages from this part of India.

The NECOPAC z39.50 service is a volunteer-run collaborative attempt at bridging this service gap. It is the first freely available, public z39.50 service in the entire North Eastern India. For us to grow bigger and be able to serve more LIS professionals in the NE, we need your active support and contributions i.e. a copy of your bibliographic records.

DISCLAIMER : The bibliographic records are provided on AS-IS basis by this service and the NECOPAC team does not vouch for the correctness or accuracy of these records.

Connecting and downloading records from NECOPAC

You will need to set up your z39.50 client using these details:

  • Server name: NECOPAC
  • Hostname: z3950.necopac.in
  • Port: 9999
  • Database: biblios
  • Syntax: MARC21 / USMARC
  • Encoding: UTF-8