Metadata took publishing by storm when online bookselling became popular, and it quickly became an essential part of the industry. The Metadata Handbook by Renée Register and Thad McIlroy was first published in December 2012, and we reviewed it here on Publishing Trends. In anticipation of the new edition coming out in January 2015, we were left wondering what has changed since the book was published. I spoke with Renée Register about what's new in the world of metadata and in the forthcoming edition. Register is co-author of The Metadata Handbook as well as founder of DataCurate.com.
You mention in the intro that you'd be interested in hearing from countries that wanted to adapt the information in the handbook for their own purposes. Has that happened?
Yes, there is now a Turkish version of the Handbook, published by the Press and Publishing Association of Turkey! We did have some interest from other countries, and we hope to pursue these opportunities in coordination with the release of the 2nd edition. Although the scope of certain sections of the first edition is limited to key organizations and aspects of publishing related to industries in the U.S., the UK, and Canada, the basics apply to publishing in most locations. ONIX for Books is the international standard for communicating information about books and is widely used in Europe and Australasia, with increasing adoption in the Asia Pacific region. We are still eager to work with other countries, either to support a direct translation or to assist with revisions that cover the specifics of the publishing industry in other places.
The Metadata Handbook as it currently stands serves as a comprehensive overview. Do you plan on expanding it further? Perhaps diving into some of the minutiae of metadata?
The 2nd edition will be published in January 2015 and includes significant revision and expansion. In addition to updates for existing chapters, there are new chapters devoted to search engine optimization, keywords, and subjects; optimizing metadata for digital publishing; and metadata for self-publishers and small publishers. The glossary, references, and vendor directory will also be updated.
Although the new edition has been expanded, it is still designed as more of an overview with links to training resources rather than a “how-to.” Over the past two years, I’ve created four online courses and three smaller, more focused handbooks in coordination with Digital Book World publishing and Digital Book World University. The classes include assignments allowing for more in-depth study and hands-on metadata creation. I’m also doing a three-hour workshop at the January 2015 Digital Book World conference.
Do you sense any shift in the way that backlist metadata is being handled?
My sense is that more publishers, especially the large publishers, have updated legacy metadata so that it can be transmitted in ONIX and works better for search, discovery, and bookselling in the virtual marketplace. However, metadata for different formats of the same content is still often inconsistent, and the separate workflows for converting backlist titles to digital and for creating new digital titles cause problems with metadata completeness and consistency.
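For readers curious what "transmitted in ONIX" looks like in practice, here is a minimal sketch in Python that assembles a bare-bones ONIX 3.0 product record for a single ISBN. The element names follow the ONIX for Books reference tags, but the record reference, ISBN, and title below are placeholders of my own, not examples from the Handbook, and a real feed would carry far more detail.

```python
# Minimal sketch of an ONIX 3.0 <Product> record using the standard library.
# The record reference, ISBN, and title are illustrative placeholders only.
import xml.etree.ElementTree as ET

product = ET.Element("Product")
ET.SubElement(product, "RecordReference").text = "example.publisher.0001"
ET.SubElement(product, "NotificationType").text = "03"  # 03 = confirmed record

identifier = ET.SubElement(product, "ProductIdentifier")
ET.SubElement(identifier, "ProductIDType").text = "15"  # 15 = ISBN-13
ET.SubElement(identifier, "IDValue").text = "9780000000002"

descriptive = ET.SubElement(product, "DescriptiveDetail")
title_detail = ET.SubElement(descriptive, "TitleDetail")
ET.SubElement(title_detail, "TitleType").text = "01"  # 01 = distinctive title
title_element = ET.SubElement(title_detail, "TitleElement")
ET.SubElement(title_element, "TitleElementLevel").text = "01"  # product level
ET.SubElement(title_element, "TitleText").text = "An Example Title"

print(ET.tostring(product, encoding="unicode"))
```

The point of the sketch is simply that ONIX is structured, machine-readable XML: every format of a work gets its own product record, which is why inconsistent workflows for print and digital versions show up so readily as inconsistent metadata downstream.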
Do you think that small and independent publishers have been more successful at integrating metadata since the book was published?
I’ve seen a tremendous amount of interest in metadata from small and independent publishers. Many representatives of this segment have purchased the Handbook and other publications, taken the DBW University classes, and some have reached out to me individually. There are also more vendors with services designed to help small and independent publishers create and distribute metadata in ONIX.
Is the ever-increasing flood of metadata from self-published books a help or a hindrance?
It’s definitely a challenge, especially for aggregators that try to provide a comprehensive listing of available titles and for booksellers that try to provide consumer (or business-to-business) access to all titles. Amazon’s self-publishing platform and some others do not require an ISBN, instead assigning a proprietary identifier that is of little help outside of that vendor’s system. The ISBN is the international standard for identifying titles, so titles without one remain somewhat locked into a proprietary system. The new section of the Handbook for this segment tries to address some of these issues and offers advice on creating good metadata using the manual metadata entry systems offered by most self-publishing platforms.
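For anyone keying ISBNs into those manual entry forms, the ISBN-13 check digit is easy to verify programmatically before the metadata goes out. This small Python sketch shows the standard alternating 1/3 weighting defined for ISBN-13; the function name and sample numbers are mine, used only for illustration.

```python
def isbn13_check_digit_is_valid(isbn: str) -> bool:
    """Validate an ISBN-13 using the standard alternating 1/3 weighting."""
    digits = [int(ch) for ch in isbn if ch.isdigit()]
    if len(digits) != 13:
        return False
    total = sum(d * (1 if i % 2 == 0 else 3) for i, d in enumerate(digits[:12]))
    check = (10 - total % 10) % 10
    return check == digits[12]

# A well-formed (placeholder) ISBN-13 passes; a mistyped digit does not.
print(isbn13_check_digit_is_valid("978-0-00-000000-2"))  # True
print(isbn13_check_digit_is_valid("978-0-00-000001-2"))  # False
```

A check like this catches simple typos, but it does nothing to bridge the gap the interview describes: a title with only a platform-specific identifier still cannot be matched reliably across aggregators and booksellers.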
You note that Amazon’s early adoption of metadata helped them gain the upper hand in bookselling. Do you think that everyone has fully caught up, or at least gotten close?
Publishers had to adapt to meet the requirements for selling on amazon.com. This involved creating processes and systems to provide robust metadata electronically and converting legacy metadata – often used mainly for business transactions and internal inventory – to metadata used by the public for discovery. Booksellers such as Barnes and Noble now have excellent websites and invest a great deal of time and resources in collecting, creating, and maintaining good metadata. Publishers today send metadata files to multiple selling partners in support of wholesale, distribution, and retail business models and platforms.
That said, Amazon remains the giant in this space and, as recent disputes between Amazon and publishers have shown, a force that is driving a lot of serious thinking about how sales channels work today. We’ve seen some publishers develop direct-to-consumer selling from their own websites. Because that model has no downstream metadata normalization, validation, or enhancement, it makes the creation of good metadata at the publisher level, and from the very first stages of the publication process, even more important.