- isbn2bibtex: a shell script that ties everything together, calls the various helper scripts, & manages the temp files etc.
- 4 new perl modules: MARC / libmarc-perl, MARC::Record / libmarc-record-perl, Net::Amazon / libnet-amazon-perl, Net::Z3950 / libnet-z3950-perl (debian packages available now from the toastfreeware debian repository)
- 3 perl scripts, written by me (i.e. copied from the man pages of the above mentioned modules & adapted): isbn2usmarc, marcclean, isbn2amazon2ris
- 2 binaries (ris2xml & xml2bib) taken from bibutils
- marc2ris, a perl script adapted from refdb
isbn2usmarc searches for the isbns on several z39.50 servers, marcclean removes duplicates, marc2ris converts the entries from usmarc (a.k.a. marc21) format to ris format, and ris2xml & xml2bib finally produce bibtex output.
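roughly, isbn2bibtex glues these steps together like this (only a sketch: the file names & the exact options of my own helper scripts are made up here, ris2xml & xml2bib are the real bibutils filters):

    #!/bin/sh
    # sketch of the isbn2bibtex pipeline (file names & helper script options assumed)
    isbns=isbns.txt
    tmp=$(mktemp -d) || exit 1

    isbn2usmarc  < "$isbns"          > "$tmp/raw.marc"     # query the z39.50 servers
    marcclean    < "$tmp/raw.marc"   > "$tmp/clean.marc"   # throw out duplicate records
    marc2ris     < "$tmp/clean.marc" > "$tmp/books.ris"    # usmarc (marc21) -> ris
    ris2xml "$tmp/books.ris" | xml2bib > books.bib         # ris -> mods xml -> bibtex

    rm -rf "$tmp"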
unfortunately the z39.50 servers are not only hard to find & slow & everything, the ones I use also don't find all the books I was looking for (fiction doesn't seem to be the strong suit of public & scientific libraries). that's where isbn2amazon2ris comes in: it tries to retrieve information about the missing books from amazon.{de,co.uk,com}. unfortunately amazon doesn't return any (publishing) address information, though. but now I have an amazon web services account. oh well.
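in the wrapper the fallback boils down to something like this (again only a sketch; the file names & the stdin/stdout behaviour of isbn2amazon2ris are assumptions):

    # assumed file names: isbns.txt = all isbns, found.txt = the ones the
    # z39.50 servers returned; the rest go to amazon via isbn2amazon2ris
    sort isbns.txt > all.sorted
    sort found.txt > found.sorted
    comm -23 all.sorted found.sorted > missing.txt   # isbns only in all, not in found
    isbn2amazon2ris < missing.txt >> books.ris       # ask amazon.{de,co.uk,com}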
well, and after all the fuss I have bibtex files which I can open in any (text or) bibtex editor, e.g. jabref (debian package), and recreate the keys in order to avoid duplicates.
now that the stuff seems to work more or less I have to decide if I really want to
- type the isbns of all my books
- store the data in bibtex format
update: adapted isbn2usmarc to search for only 10 isbns at a time because some z39.50 servers whined & errored out on higher numbers. argl.
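(the 10-isbn limit lives inside isbn2usmarc itself; done from the shell side instead it would look roughly like this:)

    # hand isbn2usmarc at most 10 isbns per call; chunk file names are made up
    split -l 10 isbns.txt chunk.
    for f in chunk.*; do
        isbn2usmarc < "$f" >> raw.marc
    done
    rm -f chunk.*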
update 2: started to type in some isbn numbers. it's not really that much work, especially if many books are from the same publisher (i.e. the first 2 parts of the isbn are the same) & if the isbn is printed on the back (usual for "newer" books).

update 3: with the help of bibtex2html I publish a list of the books in my library from a daily cronjob.
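the crontab entry for that is nothing fancy, something along these lines (the paths are made up):

    # m h dom mon dow  command -- regenerate the html book list once a day
    30 4 * * *  bibtex2html -o /var/www/books /home/me/books.bib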