I set up a new Taxonomy Vocabulary.
It includes a number of additional custom fields:
2 x Term Reference (each referencing different vocabularies)
1 x Boolean
1 x Entity Reference
1 x Image
1 x Integer
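For context, here is roughly how a couple of those fields are attached to the term bundle (a minimal Field API sketch; machine names like field_genre and field_hide_episodes, and the bundle 'lorem', are my assumptions based on the column list further down, not necessarily my exact setup):

<?php
// Hypothetical sketch: define a term reference field and a boolean field
// and attach them to the terms of the 'lorem' vocabulary (names assumed).
$fields = array(
  array(
    'field_name' => 'field_genre',
    'type' => 'taxonomy_term_reference',
    'settings' => array(
      'allowed_values' => array(array('vocabulary' => 'genre', 'parent' => 0)),
    ),
  ),
  array(
    'field_name' => 'field_hide_episodes',
    'type' => 'list_boolean',
    'settings' => array('allowed_values' => array(0 => 'No', 1 => 'Yes')),
  ),
);
foreach ($fields as $field) {
  if (!field_info_field($field['field_name'])) {
    field_create_field($field);
  }
  field_create_instance(array(
    'field_name' => $field['field_name'],
    'entity_type' => 'taxonomy_term',
    'bundle' => 'lorem', // Vocabulary machine name (assumed).
    'label' => $field['field_name'],
  ));
}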
I have been testing out this module to see if I will be able to use it to port the vocabulary and its terms from a dev environment to a QA environment.
I've only added a couple of terms so far (for testing purposes), and the .csv contents are as follows (2 items, 1 line each):
"Lorem ipsum","Lorem","<p>Lorem ipsum dolor sit amet, consectetur adipiscing elit. Nunc et eros et metus laoreet fermentum. Nunc fringilla iaculis nulla eu congue. Aliquam sit amet euismod purus. Phasellus imperdiet ullamcorper felis, id ultricies justo mollis ac. Suspendisse a dui vitae ligula tincidunt convallis sed at magna. Pellentesque tempus arcu id elementum sodales. Aenean vulputate, elit eu mollis volutpat, nunc magna euismod dolor, vel ornare massa tortor sit amet nibh. In hac habitasse platea dictumst. Aliquam commodo, diam sed pulvinar aliquam, ligula est pretium est, et cursus magna tellus et ligula</p>","filtered_html","0","","Ipsum","0","54284","","9","lorem ipsum"
"Consectetur Adipiscing","Lorem","<p>Integer euismod quam eget lorem blandit, ac varius libero pellentesque. Ut eget turpis sed mauris egestas lacinia viverra sed urna. Maecenas in libero eu massa volutpat scelerisque. Nulla sed euismod est, viverra egestas mauris. Suspendisse ut tortor lacinia, pulvinar nunc et, aliquam leo. Pellentesque pulvinar, orci id consectetur blandit, lacus risus tristique purus, eget tristique turpis purus a nisi. Vivamus eleifend tortor quis est posuere, vitae tincidunt metus fringilla. Morbi sollicitudin diam a dui rhoncus tempor. Duis et justo consectetur, tristique arcu et, eleifend mauris. Donec vel gravida dui. Aliquam mattis et nulla in sodales. Curabitur vehicula nunc ac fringilla fermentum. Proin semper ultricies convallis. Nam non pellentesque lectus. Sed scelerisque mattis consectetur.</p>","filtered_html","0","","Ipsum","0","54284","","1","consectetur adipiscing"
When I come to import the data, I have already set up the vocabulary first, because the exported data set has a 'vocabulary_machine_name' set, and the notes on the 'What do you want to Import?' tab say it is recommended to do it this way...
I then make the following selections (so you might be able to tell me if I'm missing anything):
1. Under 'What do you want to Import?' I choose 'fields'
2. On this same tab, under 'Set order of items on a csv line' I put: name, vocabulary_machine_name, description, format, weight, parent, field_genre, field_hide_episodes, field_tvchannel, field_image, field_order, field_tags - basically exactly what is listed in the post-export message
3. On the 'Where are items to import?' screen I've tried this both ways, linking to the .csv or copying the contents. Either way I get the same result.
4. On the 'How is your source formatted' tab I select 'comma' and then 'quotation mark'; I leave the other options alone.
5. On the 'Which vocabulary do you want to import into?' tab I select my recently added 'Lorem' vocabulary - I also make a point of ticking 'Automatically delete all terms of the selected vocabulary before import' - not that it should make any difference at this stage :P
I then click 'import'
The progress bar on the next page starts up for a second and then ends abruptly with this error:
An error occurred during the import.
Please continue to the error page
An AJAX HTTP error occurred.
HTTP Result Code: 500
Debugging information follows.
Path: /batch?id=1899&op=do
StatusText: Service unavailable (with message)
ResponseText:
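As a next step I'll dig through the recent log messages for the underlying exception; assuming the Database logging (dblog) module is enabled, something like this should surface it:

<?php
// Pull the five most recent watchdog entries to see the PHP error
// behind the 500 response (requires the Database logging module).
$result = db_query('SELECT type, message, variables FROM {watchdog} ORDER BY wid DESC LIMIT 5');
foreach ($result as $row) {
  $variables = unserialize($row->variables);
  print $row->type . ': ' . format_string($row->message, is_array($variables) ? $variables : array()) . "\n";
}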
The error page linked to displays this at the top:
Importation failed. Import process was successful until the line 1 of a total of 2. You can first check your file on this line and check file uploading.
This issue is related to import process or to size import and probably not to content. You can disable hierarchy check and reduce log level. You can divide your import file into lighter files. You can increase php and sql memory. If problem does not disappear, you can reinstall module from a fresh release or submit an issue on Taxonomy CSV import/export module.
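For completeness I'll try the suggestion about PHP memory first; the limits can be raised in sites/default/settings.php before retrying (the values below are just examples, not recommendations):

<?php
// Added to settings.php: raise PHP limits for the import batch.
ini_set('memory_limit', '256M');
ini_set('max_execution_time', 240);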
The message above seems to indicate that something is timing out somewhere, but my file is only a few bytes large, so how can that be?
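To rule out a malformed line, I can at least check that every row parses into the expected 12 columns (the order from step 2 above) with a quick script like this (the file name is just an example):

<?php
// Sanity check: each line of the file should parse into 12 columns
// (name, vocabulary_machine_name, description, format, weight, parent,
// field_genre, field_hide_episodes, field_tvchannel, field_image,
// field_order, field_tags).
$expected = 12;
$handle = fopen('lorem_terms.csv', 'r');
$line = 0;
while (($row = fgetcsv($handle, 0, ',', '"')) !== FALSE) {
  $line++;
  if (count($row) != $expected) {
    print "Line $line has " . count($row) . " columns instead of $expected\n";
  }
}
fclose($handle);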
Interestingly, despite the vocabulary displaying no terms when I look at it, if I export the data back out to a .csv it clearly contains the first item:
"Lorem ipsum","Lorem","<p>Lorem ipsum dolor sit amet, consectetur adipiscing elit. Nunc et eros et metus laoreet fermentum. Nunc fringilla iaculis nulla eu congue. Aliquam sit amet euismod purus. Phasellus imperdiet ullamcorper felis, id ultricies justo mollis ac. Suspendisse a dui vitae ligula tincidunt convallis sed at magna. Pellentesque tempus arcu id elementum sodales. Aenean vulputate, elit eu mollis volutpat, nunc magna euismod dolor, vel ornare massa tortor sit amet nibh. In hac habitasse platea dictumst. Aliquam commodo, diam sed pulvinar aliquam, ligula est pretium est, et cursus magna tellus et ligula.</p>","filtered_html","0","","Ipsum","0","54284","","9","lorem ipsum"
Has anyone else experienced anything like this?
Status: Closed (outdated)
Version: 5.10
Component: Code