Hello, I am trying to use Sashimi on a .csv file of abstracts (plus other metadata) from Web of Science. I successfully uploaded the zipped .csv, and the Data Parser seems to have worked correctly. However, when I run the Prepare Corpus script for Sashimi, I get the error below. When I look at the job log, it says “No logs found”. Thanks for your help!
Prepare Corpus did not finish successfully. See the job log for additional information. To obtain help with this issue, ask us at the forum and include the following text: job id: 446924
Ni! Hi Dan. There was a mishap in our infrastructure. It should be working now.
Let us know if you still find any issues.
Also, had you been using Sashimi before? If so, we’d be interested in hearing what you think about the new “workbook” format that replaced the old HTML tables.
Best!
Hi Ale! I’ve just recently started working more with Sashimi, so I can’t really comment on the new format. The “Prepare Corpus” script seems to be working now, though! I was also able to run some domain-topic models. However, when I tried to create a domain map, I got the following error:
Domain Maps did not finish successfully. See the job log for additional information. To obtain help with this issue, ask us at the forum and include the following text: job id: 447178
And here is the log:
2025-10-27 16:29:23 INFO : Job store: /srv/local/documents/d740/d7400d37227fc04498e4db2fcb79f8cc/447178
2025-10-27 16:29:23 INFO :
Your choices:
Container to be used: false
Whether the method needs network access: false
SASHIMI method to be called: false
Title column: Article_Title
Time column: Publication_Year
URL field: DOI_Link
? "URL template (\"URL field\" \u2192 {})\n"
: '{}'
Domain selection: ''
Add columns to workbook:
- Article_Title
2025-10-27 16:29:23 INFO : HTTP Request: GET http://10.1.59.221:3000/api/project/256170006910/analysis/5042350006910 "HTTP/1.1 200 OK"
2025-10-27 16:29:23 INFO : Loaded config: /srv/local/documents/86bf/86bf0336827b8133c7849055932845eb/447168
2025-10-27 16:30:33 ERROR : Exit: job failed.
2025-10-27 16:30:33 ERROR : <class 'ValueError'>: Unknown URL type: nan
2025-10-27 16:30:33 ERROR : Traceback:
Ni! Hi Dan,
I see the problem: some cells in your data’s “DOI_Link” column contain the literal text “nan”, which is not a URL that the XLSX (Excel) format accepts.
We have added a check to make sure this situation no longer causes an error.
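For the curious, the check boils down to treating missing cells as “no link” before the workbook is written. The snippet below is only a sketch of the idea, with made-up names (clean_url_column, url_col), not the actual Sashimi code:

import pandas as pd

def clean_url_column(df: pd.DataFrame, url_col: str) -> pd.DataFrame:
    """Blank out unusable values in a URL column before writing the XLSX workbook.

    Empty cells in the CSV are read as NaN, and a naive string conversion
    turns them into the literal text "nan", which Excel refuses as a
    hyperlink target.
    """
    df = df.copy()
    # Treat NaN and the literal strings "nan"/"" as "no URL at all".
    bad = df[url_col].isna() | df[url_col].astype(str).str.lower().isin({"nan", ""})
    df.loc[bad, url_col] = None
    return df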
If you rerun the analysis, you should get the results.
And thanks a lot for reporting the issue.
Hi Ale,
Domain Maps worked once yesterday, but when I try to run it with a different domain-topic model, I get an error again:
Domain Maps did not finish successfully. See the job log for additional information. To obtain help with this issue, ask us at the forum and include the following text: job id: 447279
2025-10-28 19:49:44 INFO : Job store: /srv/local/documents/2086/2086e87fcb25a25261e32887100d28b1/447279
2025-10-28 19:49:44 INFO :
Your choices:
Container to be used: false
Whether the method needs network access: false
SASHIMI method to be called: false
Title column: Article_Title
Time column: Abstract
URL field: DOI_Link
? "URL template (\"URL field\" \u2192 {})\n"
: '{}'
Domain selection: ''
Add columns to workbook: false
2025-10-28 19:49:45 INFO : HTTP Request: GET http://10.1.59.221:3000/api/project/254780006910/analysis/5043630006910 "HTTP/1.1 200 OK"
2025-10-28 19:49:45 INFO : Loaded config: /srv/local/documents/a902/a9022d6068995bfe7d5fe4ad66e2d095/447275
2025-10-28 19:49:45 ERROR : Exit: job failed.
2025-10-28 19:49:45 ERROR : <class 'TypeError'>: Method.domain_maps() missing 1 required positional argument: 'extra_cols'
2025-10-28 19:49:45 ERROR : Traceback:
Ni! Hi Dan,
I see the issue: the application isn’t passing the default value for the “Add columns to workbook” parameter on to the method.
I’ve made sure the method has a default value of its own, and now it works. That’s a new option which, as you can tell, could have been more thoroughly tested 😉
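In code terms, the fix is simply to give the parameter a default inside the method itself, so a missing value from the application no longer raises a TypeError. This is a rough sketch with illustrative names, not the real Sashimi signature:

from typing import Optional, Sequence

class Method:
    def domain_maps(self, extra_cols: Optional[Sequence[str]] = None):
        # Fall back to "no extra columns" when the application omits the
        # "Add columns to workbook" choice instead of passing a value.
        extra_cols = list(extra_cols) if extra_cols else []
        # ... build the domain map workbook and append `extra_cols` to it ...
        return extra_cols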
Thanks again for reporting.