Hi all, when I try to run the example given in the phylogeography-with-travel-history tutorial (but using a tree instead of reconstructing the jumps for a single taxon), I get this error:
dr.app.tools.TaxaMarkovJumpHistoryAnalyzer
Error: Could not find or load main class dr.app.tools.TaxaMarkovJumpHistoryAnalyzer
I am using the source-code version of BEAST X to do this. Can you guide me on how to launch it?
Dear all, I have the same problem as RicardoRH96. I'm currently analysing SARS-CoV-2 sequences, and several papers mention the TreeMarkovJumpHistoryAnalyzer tool, but it isn't clear what the BEAST command line for this step looks like. Following the tutorial "Accommodating individual travel history in discrete phylogeographic diffusion", I was able to replicate the protocol for a taxon specified on the command line, but I'm still unsure whether TreeMarkovJumpHistoryAnalyzer is invoked in a similar way or whether a separate command must be used.
You can run the tool using java -cp <path to beast jar file> dr.app.tools.TaxaMarkovJumpHistoryAnalyzer -help
If you're using the source code directly you can build the JAR file using ant and then run java -cp build/dist/beast.jar dr.app.tools.TaxaMarkovJumpHistoryAnalyzer -help.
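Putting the two answers together, a minimal session from a BEAST source checkout might look like the sketch below. The package path for TreeMarkovJumpHistoryAnalyzer is an assumption here, mirrored from where TaxaMarkovJumpHistoryAnalyzer lives; check the built jar if the class is not found.

```shell
# Sketch, assuming you are in the root of the BEAST source checkout
# and have already run `ant`, which produces build/dist/beast.jar.
JAR="build/dist/beast.jar"
# Assumed package path, analogous to dr.app.tools.TaxaMarkovJumpHistoryAnalyzer:
CLASS="dr.app.tools.TreeMarkovJumpHistoryAnalyzer"
# Assemble the launch command; run it from the source root after building.
CMD="java -cp $JAR $CLASS -help"
echo "$CMD"
```

The key point is that these analyzer tools have no standalone launcher script; they are main classes inside the BEAST jar, so you always invoke them with `java -cp <jar> <fully.qualified.ClassName>`.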