One of the things I’ve been trying to do since the project itself ended is to get the site represented on search engines like Google. Naively, I thought this was going to be the easy part of the story, but it turns out not to be. Earlier, I’d made the decision to include only our library catalogue records and to exclude the archives and two repositories. The reason was that everything except the library catalogue was already being crawled, whereas the library catalogue had no web presence at all.
During the project itself, I’d installed the Drupal SEO Checklist module, which is a very useful “to do” list for ensuring you have done what you need to optimize your search ranking. Some of the items were very simple, such as ensuring clean URLs were being used and that each page had a unique and meaningful title. Others (such as the addition of structured metadata) were more complex, and I hadn’t had time to do them.
I signed up to Google Webmaster Tools, and initially things looked quite promising after the release in mid-November 2012. Within 10 days, we had climbed to a whopping 58,000 indexed items (out of about 240,000). The count plateaued there until the end of the year. Then, in early January 2013, it dropped within one week to just 12,000 items, and it has never really recovered beyond about 18,000 since. No changes had been made to the site during those weeks, so it was not possible to isolate the cause.
Reading various resources, it seemed possible that a lack of exposed structured metadata was holding things back; perhaps Google had changed its indexing policy to coincide with the new year. My dilemma was that although there was structured data in the database, I was using the Drupal Panels module to present a more human-friendly interface as a springboard to each catalogue record, and that interface was hiding the structured data from the search engine.
Eventually I reckoned I could work with the Panels module to expose the required structured data, particularly as Google had by then introduced a structured data testing tool with which I could check the results of my handiwork. A couple of hours allowed me to shoehorn this together, and the testing tool seemed happy with the schema.org metadata I was pumping out. I then resubmitted the sitemap and waited. And here we are, more than a month later: there was a brief spike from 18,000 to 24,000 indexed items, but it has now reverted to 18,000 again, so there seems to have been no gain.
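For anyone wondering what exposing schema.org metadata for a catalogue record looks like in practice, here is a minimal sketch of the kind of microdata a record page might embed. This is purely illustrative — the title, author, and identifier values are invented, not taken from the actual site — but it is the sort of markup Google’s structured data testing tool can validate:

```html
<!-- Illustrative schema.org microdata for a single catalogue record.
     All values here are invented examples, not real catalogue data. -->
<div itemscope itemtype="http://schema.org/Book">
  <h1 itemprop="name">Example Catalogue Title</h1>
  <span itemprop="author" itemscope itemtype="http://schema.org/Person">
    <span itemprop="name">A. N. Author</span>
  </span>
  <meta itemprop="datePublished" content="1901">
  <meta itemprop="isbn" content="0000000000">
</div>
```

In a Panels-based layout, markup like this would need to be emitted within the rendered record page itself, so that the crawler sees the structured data rather than only the human-friendly presentation layer.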
At this point, I have run out of ideas. Help! Does anyone have any more suggestions?