OSS: A SWOT Analysis

Our closing keynote was Eric Lease Morgan – I’ve read a ton of his articles, but had never seen him in person before.

Eric Lease Morgan

Eric started with a bit of his own history. He has been ‘kinda sorta’ writing code since 1976. While driving a taxi after college, he discovered his first ‘itch.’ He wanted to know how much he was earning, so he wrote a computer program that gave him all kinds of crazy stats about how much he was making – like how much per mile. He also liked astronomy and had an application for his calculator that let you find out where the moon was. He thought, this should be on a computer – and so he wrote a computer program that did the same thing. Eventually he even wrote an online catalog – he made it so that he could hand out a disk and people would know everything in his library.

Which brings us to open source. Open source is about community – if there is no community there is no software and there is no support! (There are some people who seem to forget this very important fact.) Speaking of support, this is the biggest challenge for open source software.

Next up: the OSS SWOT analysis.

Strengths:

  • It benefits from the numbers game – chances are there is somebody out there with your particular interests. The internet makes that happen.
  • There are plenty of choices – many people are trying to scratch an itch.

Weaknesses:

  • Support is its biggest weakness.
  • OSS requires specialized skills – not necessarily programmers – but usually a systems administrator type of person to configure the application.
  • Institutions change slowly – change takes time and it often makes people nervous.

Opportunities:

  • Very low barrier to entry – computer hardware is cheap and the software is ‘free’
  • Only limited by one’s time, imagination and ability to think systematically. OSS is like a hunk of unshaped clay.

Threats:

  • Established institutions – the status quo is threatened by OSS – FUD
  • Past experience – the profession’s leadership likens OSS to the ‘homegrown’ systems of yesterday. Perceptions are slow to change.

Eric continued with his ideas on next-generation library catalogs. Library catalogs are, and have been, essentially inventory lists, but given the current environment, the problem to be solved is not find and access but use and understand. Are we about ‘here’s the book’? Is that what we’re about? What can we do to take that one step further? We’ve used the computer to automate our processes, but let’s use the computer to supplement who we are.

Let’s assume that content is available in digital form – this is increasingly becoming true. So once you have a book, what do you do with it? Browse the TOC, check the index, put it under a wobbly chair, write in the book, analyze the content, read it…. You can read the book and get all of this – but we can provide a supplementary way of reading. So assuming our content is digital, we can count the number of times a word appears in a book and then compare that to the number of times that word appears in other books. The book that uses the word more often can be assumed to be more relevant than the others. (We’re not getting into numbers and statistics – which Eric likes – and which confuse the heck out of me.)
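To make that word-counting idea concrete, here’s a minimal sketch of my own (not anything Eric showed us) that compares how densely two plain-text ‘books’ use a given word – the two toy strings below just stand in for real full-text files:

```python
import re

def relative_frequency(term, text):
    """Occurrences of `term` per 1,000 words of `text`."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return 1000.0 * words.count(term.lower()) / len(words)

def more_relevant(term, book_a, book_b):
    """Guess which text is more 'about' the term: the one using it more densely."""
    a = relative_frequency(term, book_a)
    b = relative_frequency(term, book_b)
    return "A" if a > b else "B" if b > a else "tie"

# Toy stand-ins for full-text books
walden = "I went to the woods because I wished to live deliberately near the pond in the woods"
economy = "The economy of the nation grew steadily and the markets followed the economy upward"
print(more_relevant("woods", walden, economy))  # -> A
```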

Eric comes to the conclusion that the availability of digital full text provides a host of opportunities for libraries that go beyond find and move towards use – services against text. The root of these services grows from the ability to count the words in any set of documents.

So the next gen catalog is not just a finding mechanism but a way to understand the source material. We could add an analyze button to the OPAC and have it analyze the text, saving the reader’s time by showing them whether the book is actually relevant to their needs. You could add to that the ability to see how a word is used in the text: click on the word, and you see that word in context. This is possible using Eric’s Concordances tool.
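I don’t know how the Concordances tool is built internally, but a basic keyword-in-context display is simple enough to sketch – something roughly like this, where `walden.txt` is just a hypothetical stand-in for whatever digital text you have on hand:

```python
import re

def concordance(term, text, width=40):
    """Yield each occurrence of `term` with `width` characters of context on either side."""
    for match in re.finditer(re.escape(term), text, re.IGNORECASE):
        start = max(match.start() - width, 0)
        end = min(match.end() + width, len(text))
        yield "..." + text[start:end].replace("\n", " ") + "..."

# Hypothetical local copy of a digitized text
with open("walden.txt", encoding="utf-8") as fh:
    text = fh.read()

for line in concordance("pond", text):
    print(line)
```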

Next, we break down the book into things like number of words instead of number of pages (which is how we catalog now), because page count is ambiguous – you have no idea whether it’s a long book if you don’t know how many images there are or how big the font is. You can also see things like the grade range and the Flesch score (these, of course, are to be taken with a grain of salt because they don’t always give accurate information for an individual reader). You can see the example that Eric showed us on this record for Walden by Henry David Thoreau.
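For the curious, the Flesch reading ease score is computed from words per sentence and syllables per word. Here’s a rough sketch of the calculation – the formula itself is the standard one, but my syllable counter is a crude vowel-group heuristic, so the output is only approximate:

```python
import re

def count_syllables(word):
    """Crude heuristic: count groups of vowels, minimum of one per word."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    """206.835 - 1.015*(words/sentences) - 84.6*(syllables/words); higher = easier."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    word_count = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (word_count / sentences) - 84.6 * (syllables / word_count)

print(round(flesch_reading_ease("I went to the woods because I wished to live deliberately."), 1))
```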

Let’s imagine that this kind of metadata was in our catalog. You could search for short books for an 8th grader that have a very high ‘great ideas’ coefficient (another example that Eric gave us).

It is important to mention that Eric is not saying that this is the answer, but that it is supplemental to what we’re already doing with our controlled vocabularies and traditional cataloging.

In the end, we have the power to do this more than Google does because we know our audience. We know our patrons.


Each One Teach One at EVG10

Jennifer Bielewski and Jenny Liberatore from Lyrasis gave the Each One Teach One talk.

They started by telling us the most important thing is to tell your attendees what the outcomes of the workshop will be – what will they learn in this workshop? We were then introduced to the SMART Training Objectives:

  • Specific
    • Be specific about what people will be learning in the workshop. Without buy-in you won’t be able to teach people.
  • Measurable
    • Give people a way to look at what they achieved in the workshop.
  • Achievable
    • Use exercises to show them that they can complete the task.
  • Realistic
  • Time Bound
Each One Teach One

You never know what people really know, so be prepared to do more training than you originally planned. It might be helpful to do something fun as a pre-assessment and a post-assessment to see where your trainees are with their skills.

When training adults you need to think about how they learn: they need high participation, a collaborative environment, and experience-based examples. Adults have also already decided what’s important for themselves, so you have to bring them from that to what you think is important. Adults also expect information to be useful immediately (this sounds like me – I need to be able to use what I learned right away). We also have to keep the trainees’ attention; you can do this with crazy slides (visuals), giving activities, creating games, using humor and changing it up (don’t use just one method – that said, don’t mess with something that worked the first time).

Other tips for keeping your students’ attention include:

  • Keep it simple
  • Keep training cycles short (1-2 hrs)
  • Reinforce the material with practice
  • Motivate then reward
  • Teach only correct procedures
  • Repeat until you are sure they got it

Keep in mind that different types of expertise are necessary in an institution – so the person you think might be the best trainer isn’t always. Along those lines, the trainer doesn’t have to know everything (I know I don’t! If I don’t know the answer then I write down the question, research it, and come back later with the answer). If staff want to train, let them give it a whirl – you never know when you’re going to find a star.

There are many different instructional strategies including:

  • lecture
  • small group exercises
  • answer questions
  • hands-on
  • demos
  • discussion
  • help apply content to job

One method the presenters like to use when coming up with their workshops is the ‘Napkin Method.’ Basically, the theory is that the best ideas come about when you’re just brainstorming (the napkin coming from the idea of jotting ideas down on a napkin at dinner or at the bar).

We then did an exercise using the Napkin Method – and I learned that I am no good at thinking about things this way, which is something you have to keep in mind. I always hated workshops that taught only one way to do things (not that this workshop did that!) because everyone thinks differently and organizes things differently.


SCLENDS & Evergreen

SCLENDS is a consortium of libraries in South Carolina and their talk was next up on my agenda.

SCLENDS

While it was a panel, Rogan did most of the talking and started with some background info on South Carolina: the 2008 population was 4,479,800, there are 46 counties, and there are 2 public library consortia. In June of 2008 they started evaluating their ILS; they didn’t have any major concerns with their current system, but they knew it wouldn’t be supported anymore, so they started looking to the future. They invited Equinox to come visit and talk about Evergreen. About 15 libraries attended the demo, and what they learned was that a lot of the librarians were thinking along the same lines. At this point they were not a consortium – but they thought maybe they could form one.

10 libraries decided to step forward and join the consortium – between them they were coming from 5 different ILS backgrounds. And on May 28th, 2009 they all went live with Evergreen.

Lessons they’ve learned:

  • Front line staff cannot get too much training and preparation
  • Exceptions need to be managed
    • Evergreen isn’t even at 2.0 yet; Horizon was at 7.x – which means the things they didn’t like about Horizon were so deeply ingrained that there was nothing anyone could do about them. That said, Evergreen is starting to look like an old ILS quicker than any of the non open source systems out there.
  • Public relations are critical
  • Learn to ‘think green’ in cleaning data
    • Take the time if you’re migrating to play with it and see the underlying logic – then clean your data up so that it will work cleanly in Evergreen.
  • Learn to think global and not local
    • They may have 10 different sets of circulation policies – but they are sharing materials and shipping things back and forth – so you need to think of the whole group.

13 more locations were added on October 15, 2009. It was at this time that they didn’t know what they didn’t know. That said, there was no other way to do this than to just jump right in.

And on December 3, 2009, 3 more libraries joined, and there was a nice distribution of consortium libraries around the state. At this point they learned that there were a lot of duplicates in their system – patrons were asking for copies of books from other branches that their own branch already had.

One of the funny things that Rogan mentioned his libraries learned was that Evergreen is very MARC centric – meaning it actually treats MARC as a standard – which other ILSes have ways to ignore. The example he gave was the Leader – he knew librarians who thought it was outdated and not used anymore – and Evergreen uses it in searches, so it will affect the way things are found!! While we all know MARC can be a pain – it is a standard and should be treated as such.

The patrons are giving feedback like:

  • “There’s sooo much more now”
  • “I love getting these books so fast”
  • “I’m still getting used to it but the staff have been wonderful”

which is awesome!! They also have patrons getting excited because they can now get items that their local library doesn’t buy (due to collection development policies) from other branches. It has opened up the world of resources for patrons.


KCLS Enhancements to Evergreen

Jed Moffit, Bill Erickson and Matt Carlson were up during our lunch to talk to us about the developments coming up from KCLS.

Evergreen 2010

Jed says that we have to have something in the new developments that is better than ‘sucks less technology.’ He is hoping that these developments will be genuinely of value to all of us.

Matt was up next and started with a history. He said that 2009 was the year of circulation and user interfaces. 2010 however is the year of Acquisitions, Cataloging, Serials and the OPAC.

First up was the patron registration screen. They pulled all the fields into one form so you can work on it all on one page. They also added field-specific help files (which can be added by the librarian) and the ability to auto-fill the city and state fields based on the zip code. Other edits were a bit smaller but made huge improvements, like better use of the screen real estate and buttons for the really common tasks across the top. As this was being shown, someone near me kept saying ‘wow’ – so for those who are using Evergreen now, these improvements are ‘wow-worthy.’

They’ve also added a staff client activity log that shows the last X transactions performed at a staff machine, so you can see, for example, the last 5 patrons who were helped and at what station. This log is only for circulation actions.

There is a new patron merge interface where you can merge patrons together and it will keep everything related to the patron in the new merged record (bookbags, holds, etc).

Moving on to catalog items. They are working on a single page to get all the info about an item that you might need. Right now you have to go to different places to find it all.

As I mentioned in my Acquisitions summary, they are working on the connection to OCLC Connexion. So catalogers can edit records in Connexion and load them into Evergreen.

On the notices front, they will make it so that overdue notices and other notices are sent out automatically – and give staff control over how that happens.

There were a bunch of other little updates that Bill ran through quickly, that you’ll just have to wait to see :)


Evergreen Acquisitions Roundup

Bill Erickson started his Acquisitions roundup by showing us ‘Selection Lists’ – this is a concept I’m unfamiliar with. It’s basically a list of items you want to perform actions on. So you can create a list of items to order and then order from it – but you don’t have to – you can order items individually if you want.

Acquisitions in Evergreen

From a bib record you can choose to ‘show/create orders’ and this will show you a history of all orders made for that item including costs. From here you can also add items to the selection list.

From the Z39.50 search you can search for items to order from records cataloged by others and add them to a selection list from there. You can also create a brief record of your own (not the traditional MARC editor, just simple questions).

Next we saw the Acquisitions search functionality, which is pretty extensive – awesome. You can search across all different kinds of objects, so you can search for line items, invoices and selection lists all at once. You can even upload a file with a batch of ISBNs (or UPCs) and search for those items to see what you have or haven’t ordered yet (very, very nifty).

This was followed by Patron Requests. Patrons can log in to the OPAC, enter what they know about the title they’re requesting, and submit that to a staging area that the staff can see. From the request list, patrons can put a hold on the item for when it arrives in the library. Should you want to reject a request, you can define the reasons for rejecting it. There are also a number of new notifications that can be sent to the patron (patron request received, ordered, rejected, item received, etc.).

For individual line items you can add notes so that your acq librarians can communicate with each other. So if more than one librarian works on your acq process, each can leave messages for the other should they need to.

When ordering you can edit fields related to the item you’re ordering in batch or line by line. So if you have 1 title that you want 5 copies of, you’ll need to assign the call number and barcode for all 5 items, but you can do this in one click instead of repeating it 5 times. This also means that each copy you order can come from a different fund.

After you have your order ready you can create a Purchase Order. This process is also where you get to decide whether you want to add ‘on order’ items to your catalog (this process will generate temp barcodes for each item). Once the PO is ‘activated’ it can be sent to the vendor using EDI, and the funds are marked as encumbered. You can also create print POs for vendors that require a paper PO.

This next part confuses me – but it seems like it might be cool if I can try it out and figure it out – when the vendor sends you the MARC file you can upload it into the acquisitions module and somehow it matches everything up and puts the full MARC records into your catalog. The part I’m confused about is that I think Bill is saying that you can upload a MARC file without having a PO in place and it will generate one for you … feel free to comment if you know more about this.

This is being tested/actively communicated with:

  • Baker & Taylor
  • Ingram
  • Brodart
  • Midwest Tapes
  • Book House
  • Ebsco

And lots of others are on the radar (Bound to Stay Bound, Gale Cengage, Scholastic/Grolier, Blackwell, Random House, Quality Books, Coutts, Library Bound, S&D Books, Raincoast, BWI, Midwest Library Service), but haven’t been tested yet. This is what makes EDI so hard: configuring it to work with each vendor.

Next you need to receive items; you can do this for a whole PO or for a single line item. You can also un-receive – aka roll back the whole process.

For the items you can’t receive you need to set up claim policies. To do this you create a policy and then you link a series of actions to that policy. For example you can have a print materials claim policy that sends an email after 10 days late and then again after 30 days late. You can also attach a policy to a vendor, but you can override this on a per line item basis if you want. These are not managed by EDI yet, but will be.

When you are receiving you can then update the items in batch to enter in the real barcodes where the temp barcodes were generated before. Next you might have to do some merging of the records you got from the vendor over your records. The merging offers all kinds of control over what fields you keep and which you don’t want. Along with this will come Connexion Integration.

After this you’re ready for Invoicing. I have to admit – here I got a little lost – which is probably because of my inexperience with working in acquisitions. Also, Bill ran out of time because there was so so so much cool stuff to show, so we didn’t get to see it all :(

Learn more by seeing the slides here.


KCLS & Evergreen

Bill Ptacek from the King County Library System (KCLS) was our keynote with his talk entitled “From Singing the Blues to the Birth of Cool.”

KCLS serves 1.7 million people in a county where many large (often tech) companies live – Microsoft, Nintendo, Boeing, Amazon and Amgen. In addition to housing these major companies, King County is home to lots and lots of coffee (a Starbucks on every corner). There are also 18 different school districts.

KCLS & Evergreen

The library has a circulation of 21.3 million (2009), which makes them the third busiest library in the US. 25% of all of their use comes from items put on hold and delivered to the patron’s local library. Because of this (or maybe the holds happen because of this), KCLS has an amazing delivery system that gets the books where they need to be quickly. In addition, they have 9.85 million people coming into the library. People are coming to the library and want to spend time there! They also have 26.8 million hits to their library website and 88.6 million hits to the catalog. In short, people are coming in from all avenues, but the biggest reason they use us is to get to our ‘stuff’ (and KCLS spends over $13 million a year on their collections).

KCLS has had several different library systems before choosing Evergreen. So, why did they make that decision?

  • What led KCLS to look at open source?

    • The problem with the proprietary vendor was that they were selling ‘things,’ and the more ‘things’ they could sell the better. This means that the vendor was focused on selling and not on services – aka support. There were a number of things KCLS wanted to do in their system that they couldn’t do. And they lived in a community with lots of high tech people saying ‘I could do this!’ – but they couldn’t, because they couldn’t get into the system to edit the code. Over time, after lots of discussions with the vendor always ending the same way, they decided to start looking at other alternatives. They hired people to do this for them, and these people looked not only at libraries but at companies that lend things. They came back and said that the library marketplace was terrible (shocker!). This company then suggested they take a look at what was happening with open source.
  • Why is open source a good option?
    • It’s the way not only the public sector but the private sector is going. The environment is such that people are saying open source is the way to go. More importantly, it supports integration and collaboration. Bill says, ‘We don’t want this to be an ESI project, but a library project,’ meaning that they want the library to drive the development and the direction that Evergreen takes. And KCLS can say this because open source supports a real change in model – one different from the one libraries have been used to for years.
  • Why Evergreen?
    • We liked the spirit of innovation and the spirit of community. It’s a product that’s driven by the people! There already was a community in place and it was a growing community (me: one thing that I always teach librarians to look for – an active community behind the open source application). And finally – it was cool!
  • Why should KCLS take on this development work?
    • They are a separate tax district and they have a board that controls everything. Second, they have great funding (75% of the households use the library)! Finally, they have a history of innovation so this just makes sense. And while the library is well funded, the decision has never been about money.
  • What are the future implications for technology in public libraries? Why is this different than before?
    • Because it is different! It’s the ability to control and manage this product. It means the flexibility to do what we need to do and respond appropriately to our patrons. ‘You shouldn’t need a PhD in III to be able to work with the system’ (I love it!). The software has to be easier – it has to take less than 7 steps to delete a book. The patron catalog has to work and has to be integrated – it has to link up with all of the social networking stuff!

Bill concluded with a lot of ideas for the future. One that I totally agree with is that we need to get more eyes on the system. We need more librarians involved so that we can make the system better. We as librarians know what is good and what’s not, and we need to translate that knowledge and make it available to our communities – and we have to use technology to get that knowledge out to the community. This is the change.

Bill notes, “We wouldn’t be here if it weren’t for the work done by those who went before us.”


State of Evergreen

Bob Molyneux was up next with the ‘State of Evergreen’ talk.

Evergreen 2010
  • Dec 1999 PINES goes live with 26 systems, 98 outlets
    • People would actually pass over their local library to go to a PINES library instead. So PINES grew because librarians saw their patrons leaving for libraries that belonged to PINES.
  • June 2004 Lamar Veatch commits the Georgia Public Library Service to a one year test of an open source initiative
    • Bob believes that a system for a consortial environment could only have been developed in an open source environment. Bob talked about a consortium that had 2 systems with 70ish libraries in each. He asked if they had split the network and they said yes. Bob feels this is because of limitations in the software – when they change to Evergreen they will bring the 2 groups back into 1.
  • Sept 5, 2006, Evergreen goes live (46 systems)
  • November 2007, Prince Rupert Public Library, Prince Rupert, BC
  • June 2008, University of Prince Edward Island
  • November 2008, Tsuga (Innisfil, Ontario, Public Library)
  • June 2009, Natural Resources Canada

Bob mentioned the changing nature of FUD — people used to ask, “Open source? You’re going to use code written by a bunch of dope-smokin’ hippies?” Now they are a bit more educated.

As we already know, Evergreen is the first consortial library system designed for sharing an online catalog, shared resources and geographically spread out systems. Evergreen can accommodate libraries having their own separate policies (each physical building can decide on its own policies). I haven’t used the admin side of any proprietary ILS, so I’m thinking this sounds like those systems don’t allow this kind of thing … but I could be wrong in this assumption.

The true nature of open source – developers of Koha will learn from Evergreen and Evergreen developers will learn as Koha develops new features!

Bob invented a new word – ‘superconsortium.’ A superconsortium is a group of consortia who want to work together. An example is when someone from one consortium wrote to the mailing list to ask if anyone wanted a kids OPAC and several others replied. This is how the Kids OPAC development project for Evergreen was born.

What is happening now is that they’re getting a different type of person asking about Evergreen. People know more now than they did a few years ago. This is a great thing for open source and for open source library systems!


Evergreen Developers Update

The perfect start: “1.6.04 – it’s out – it fixes bugs!”

Evergreen 2010

1.6.1 will be the next major revision release, driven mainly by the need for a booking and reservation system by one of the Evergreen libraries (I didn’t catch the name). Also, courtesy of Dan Scott, a patron password reset function via email. And of course all of the bug fixes since 1.6.0.

After 1.6.1 will be 2.0 (late summer alpha)! There will be a lot of new features including acquisitions, new circulation features, serials, and updates to BibTemplate.

Acquisitions will include all you need in an acquisitions module including EDI. The new circulation functions include updates to notices.

In addition, there are some major updates to the search functionality, including truncation (which got an excited ‘yes’ from someone in the audience). It will also support termless searching – meaning you can search for ‘sort by author from z-a’ and it will give you all the items in the catalog sorted by author descending (note that might not be exactly how you type the query – but you get the idea). There will also be counts added to the facets so you know how many items use each one, and you’ll be able to click to add a facet and then click to remove it. Sounds like one of the features I love in Blacklight.


VALENJ: PINES & the Evergreen Open Source ILS

Elizabeth McKinney de Garcia, Program Director of Georgia PINES, talked to us about how PINES decided to develop their own open-source ILS, Evergreen. Georgia PINES is made up of 49 public library systems, which equates to 275 facilities and bookmobiles sharing a joint bibliographic database of nearly 9 million books.

About PINES

The PINES library card is free to residents of Georgia and can be used at any PINES library as if it were their home library. In addition, materials can be returned to any PINES library – how convenient!! ILL is available through the entire system for card holders at no charge. All libraries in the system have the same policies, so patrons have the same experience no matter what library they’re at.

In FY07 the system had more than 540,000 intra-PINES loans, as compared with just 6,000 in FY00. Patrons like the convenience of one system.

There is one easy to use interface across the board. Users have dramatically increased access to one centrally administered statewide combined library collection.

Time for a Change

When they looked at their contract with their vendor, they found that they were writing their policies around the system (once again a reference to the culture of workarounds). In the end they had a bunch of silly policies, such as how to enter a person’s name (last, first). They also found that their system was coming to a screeching halt because of the load of users hitting the system at the same time. In short, it wasn’t meeting their needs.

After talking to nearly all the vendors they found that there really was no place for them to go – in short, they were cornered into making their own system.

Enter Evergreen

The entire development process took a little under 2 years. They had to decide where to draw the line – their libraries had never been able to use acquisitions or serials, so they didn’t develop those in the initial program. In short, their ILS was designed by librarians for libraries.

Georgia PINES went live in September 2006 with their new ILS, Evergreen. Unfortunately (or fortunately) the first day they had so many hits they came to a screeching halt – this was probably because of all of the press that was sent out to librarians!! :) Since then, things have been great.

Why Open Source

Elizabeth referred to open source as the difference between renting and owning. By “owning” the software we’re responsible if the AC goes out or the roof leaks, but it’s a great place to be! We get what we need and we get what we want – we don’t have to hope that in 2010 the feature we want will be up for a vote. In the end, “owning” leads to an increase in control!

Conclusions

Another great example of how open source can solve a great many problems for libraries. I particularly like Elizabeth’s analogy of owning versus renting. In the end everyone owns the rights to the code behind the open source product, leading to more freedom and innovation.

I can give a personal example of this. When I was renting, I had to live surrounded by boring white walls and abide by rules like no pets and be considerate of your neighbors. Now that I own, I get to have a house full of colorful walls and barking dogs!! I’m still considerate of my neighbors, but I don’t have to worry about playing music late at night or having the dogs wake up barking at 5am.

In short – owning your own place is a lot of hard work, but it leads to a more comfortable home (at least in my place).
