No results generated with huge dump files #137
Hi Kamil, thanks for the clear documentation and the awesome tool.
Hi, sorry for the silence, I am on paternity leave. We are very close to releasing a new version, which is currently stalled while I raise an offspring, but it is actually a well-functioning version. If you take a look at the BGA23 tutorial, it already covers the update. The new version is about 100000x faster (I exaggerate a bit, but just a bit; it really is a lot faster, and the backend is written by Gene Myers). Anyway, if you take a look at it, you might be able to run smudgeplot on much bigger genomes in a fraction of the time: https://bga23.org/smudgeplot/
How does the BGA23 link address this problem? I cannot find a solution there. Thanks.
Is this software no longer supported? |
Ehm? Are you serious? |
It explains how to work with a development version of the software. If it's not clear enough for you, you'll have to wait for a proper release, I'm afraid.
Dear developers,
I encountered some issues while using Smudgeplot to predict ploidy. The species I am studying is expected to be hexaploid based on flow cytometry, but whether it is autopolyploid or allopolyploid is unknown. The dump file extracted with Jellyfish is 16 GB, and the smudgeplot.py hetkmers stage has been running for several days without producing any results, although the process is still running. I extracted 2 GB of data from the dump file and ran it again, but the prediction then showed a diploid AB type. Could you please help identify the cause of this issue and suggest ways to speed up the process?
There is more than 500 GB of free RAM remaining.
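One common reason a dump grows this large (and why hetkmers crawls) is that it still contains low-coverage sequencing-error k-mers; arbitrarily slicing 2 GB out of the dump also breaks the coverage structure, so the diploid call on the subset is not trustworthy. The approach recommended in the smudgeplot README is to compute coverage cutoffs from the k-mer histogram and filter the dump before running hetkmers. A minimal sketch of that workflow, assuming a Jellyfish database `kmer_counts.jf` and a 21-mer histogram `kmer_k21.hist` (both file names are hypothetical placeholders for your own data):

```shell
# Estimate lower (L) and upper (U) coverage cutoffs from the k-mer histogram
L=$(smudgeplot.py cutoff kmer_k21.hist L)
U=$(smudgeplot.py cutoff kmer_k21.hist U)
echo "$L $U"   # sanity-check the cutoffs before launching the long step

# Re-dump only k-mers with coverage in [L, U]; discarding error k-mers
# typically shrinks the dump dramatically
jellyfish dump -c -L "$L" -U "$U" kmer_counts.jf > kmer_LU.dump

# Pair heterozygous k-mers on the filtered dump
smudgeplot.py hetkmers -o kmer_pairs < kmer_LU.dump
```

Filtering by coverage keeps the whole genomic signal (unlike truncating the file) while cutting the input size, which is usually what makes hetkmers finish in hours instead of days on large genomes.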