I have been working on CDS/ISIS exploding as well, and my main point is about
the way Greenstone deals with ISIS databases.
If you have a large database, each record creates one OID hash, which means
thousands of them.
I would like to try another approach: converting a CDS/ISIS database into one
single HTML file with the tagging technique (using the print format), where
each record corresponds to one section. By adding Title, Author and Subject
metadata for searching, we could have text search on sections plus three
classifiers.
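The conversion could be sketched roughly as below. This is a minimal Python sketch, not a finished tool: the in-memory record list and its field names are assumptions (a real conversion would read the CDS/ISIS print-format export), and the section comments follow the description-tags style that Greenstone's HTML plugin can be told to recognize.

```python
import html

# Hypothetical sample records; in practice these would be parsed from a
# CDS/ISIS print-format export of the database.
records = [
    {"Title": "First record", "Author": "Smith, J.", "Subject": "Libraries",
     "Text": "Body of the first record."},
    {"Title": "Second record", "Author": "Jones, A.", "Subject": "Archives",
     "Text": "Body of the second record."},
]

def record_to_section(rec):
    """Wrap one record as a tagged section with its own metadata."""
    meta = "\n".join(
        '    <Metadata name="%s">%s</Metadata>' % (name, html.escape(rec[name]))
        for name in ("Title", "Author", "Subject")
    )
    return (
        "<!--\n<Section>\n  <Description>\n%s\n  </Description>\n-->\n"
        "<p>%s</p>\n<!--\n</Section>\n-->" % (meta, html.escape(rec["Text"]))
    )

def build_html(records):
    """One single HTML file; each record becomes one searchable section."""
    body = "\n".join(record_to_section(r) for r in records)
    return ("<html>\n<head><title>ISIS export</title></head>\n<body>\n"
            "%s\n</body>\n</html>" % body)

print(build_html(records))
```

With a file like this, one import produces one document with thousands of sections instead of thousands of separate documents, each section still carrying the Title, Author and Subject metadata the classifiers need.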
Quoting Lucia Phillip <email@example.com>:
> Hi Everyone,
> After experiencing problems moving large databases from CDS/ISIS to
> Greenstone I downloaded the latest version of the software and started
> again. I have noticed the following problems.
> 1) When we tried to explode one of our ISIS databases containing links to
> images, the error 'out of memory can not parse metadata.xml' occurs. With
> the assistance of one of our I.T. consultants, we have realized that the size
> of the metadata file produced in the import folder affects the successful
> execution of the next stage of the import, which is to parse that file into
> individual XML files and folders. For example, the failed
> metadata.xml file is 17,000 KB.
> 2) In trying to build another Greenstone collection from a large ISIS
> database in MARC format (this one is purely bibliographic, without links
> to digital files), the import stops at record 1828 each time. The database
> contains 10,000+ records.
> The procedure I have used is: I select the Dublin Core metadata set and
> then select explode metadata. I am then asked to either ADD, MERGE or
> IGNORE individual fields in the MARC record. I then merge some
> fields with Dublin Core and add others as additional DC elements. I am
> wondering if there is a limit to the number of DC elements which can be
> added.
> Is there anyone out there who might have experienced similar problems?