Hello, I ran into the "error while loading shared libraries: libiconv.so.2: cannot open shared object file: No such file or directory" error when running chainCleaner manually. I was able to resolve it by adding the libiconv package to my existing conda env. I found it useful to have this yml file on hand.
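For reference, a minimal conda environment sketch with the fix applied. Only the libiconv entry comes from the comment above; the name, channels, and other lines are illustrative assumptions, not taken from the thread:

```yaml
# Hypothetical environment.yml fragment; libiconv is the fix described
# above, everything else here is an illustrative assumption.
name: make_lastz_chains
channels:
  - conda-forge
  - bioconda
dependencies:
  - libiconv   # provides libiconv.so.2, which chainCleaner links against
```

Recreating the env from a pinned yml like this makes the fix reproducible instead of a one-off `conda install`.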
Merge remote-tracking branch 'upstream/main'
7e320c4 to 747b76d
Hi folks, I have created another standalone PR to fix the twobitreader problem only, without touching anything else. That one is more urgent for users processing larger genomes, and I'm testing it with real data on my side. I'll focus this PR on updating the nf-core pipeline structure. Please hold off on reviewing this PR for now; I'm adding some new functionality and doing more testing on my side. Thanks!
- Merge conf/base.config + conf/modules.config into a single nextflow.config (5 labelled sections: params, withLabel, withName, profiles, reporting)
- Add process_fast label (2 h) for short-lived steps; SLURM dynamic partition routing: htc < 4 h, public >= 4 h, --qos=public
- Add check_max() resource ceiling helper against partition hard limits
- Add process.array job arrays for LASTZ, AXT_CHAIN, REPEAT_FILLER
- Disable conda by default; apptainer is the production environment
- Add FROM_FILL_CHAINS and FROM_CLEAN_CHAINS entry workflow aliases with per-entry parameter validation
- Remove dead params: fill_chain_min_score, seq1_limit, seq2_limit, fill_prepare_memory, chaining_memory, fill_memory, chain_clean_memory
- Dockerfile: switch to full Kent rsync distribution; fix NetFilterNonNested.perl URL (raw.githubusercontent.com, commit fbdd299)
- environment.yml: add ucsc-twobitinfo
- bin/partition.py: document seq_limit as never implemented
- README, TODO, CHANGES_nfcore_refactor: updated throughout

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
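For context on the check_max() item: this helper commonly follows the standard nf-core pattern of capping a requested resource at a params.max_* ceiling rather than letting the scheduler reject the job. A simplified sketch of that pattern (the actual implementation in this PR may differ, and nf-core's version adds error handling):

```groovy
// Sketch of the usual nf-core-style check_max() helper: return the
// requested resource, capped at the configured ceiling.
def check_max(obj, type) {
    if (type == 'memory') {
        def max = params.max_memory as nextflow.util.MemoryUnit
        return obj.compareTo(max) == 1 ? max : obj
    } else if (type == 'time') {
        def max = params.max_time as nextflow.util.Duration
        return obj.compareTo(max) == 1 ? max : obj
    } else if (type == 'cpus') {
        return Math.min(obj as int, params.max_cpus as int)
    }
    return obj
}
```

It is then typically called from dynamic directives, e.g. `memory = { check_max(50.GB * task.attempt, 'memory') }`, so retries can escalate without exceeding partition hard limits.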
…ions

- Replace placeholder docker://YOUR_REGISTRY/... with docker://nilablueshirt/make_lastz_chains:latest-amd64 in nextflow.config
- Rewrite README.md: section 1 (old make_chains.py), section 2 (nf-core local), section 3 (nf-core SLURM HPC)
- Mark Docker image task done in TODO.md and CHANGES_nfcore_refactor.md

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Building LASTZ from source requires the zlib development headers (zlib1g-dev). Without them, the make step fails with a missing zlib.h.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
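In a Debian-based image this amounts to installing the header package before the LASTZ build step; a sketch (the package list beyond zlib1g-dev is an illustrative assumption):

```dockerfile
# Illustrative Dockerfile fragment: install zlib development headers
# so LASTZ's `make` can find zlib.h.
RUN apt-get update \
    && apt-get install -y --no-install-recommends build-essential zlib1g-dev \
    && rm -rf /var/lib/apt/lists/*
```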
- params.json: new file with all scientific parameters (genome paths, LASTZ settings, fill/clean options), with defaults and inline comments
- nextflow.config: remove scientific params from the params{} block; keep only outdir, resource ceilings (max_memory/cpus/time), and nf-core boilerplate
- README: update all run commands to use -params-file params.json

Users now edit params.json for each run and leave nextflow.config alone unless changing compute resources or the container image.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
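With this split, a run supplies the science via `-params-file params.json` while nextflow.config stays fixed. A sketch of what such a file might look like; the key names and values here are illustrative assumptions, not copied from the PR:

```json
{
    "target_genome": "target.2bit",
    "query_genome": "query.2bit",
    "outdir": "results",
    "lastz_q": "default"
}
```

Per-run edits then touch only this file, which also makes runs easy to version and reproduce.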
- Add ToC with anchor links at the top
- Wrap each of the three run sections in <details>/<summary> dropdowns (GitHub Markdown renders these as collapsible blocks)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
- process_fast: 0.5 h, 16 GB (genome prep, partition, LASTZ, cat, bundle, filter)
- process_single: 1 h, 16 GB (repeat filler)
- process_medium: 2 h, 50 GB (PSL sort, axtChain, merge)
- process_high: 3 h, 100 GB (chainCleaner)

SLURM queue cutoff updated to < 2 h → htc, >= 2 h → public. README routing table updated to match.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
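The queue routing described above can be expressed with dynamic directives in nextflow.config; a sketch assuming each label sets task.time as listed (the exact config in the PR may differ):

```groovy
// Illustrative sketch: route short jobs to htc and long jobs to public,
// matching the < 2 h / >= 2 h cutoff above.
process {
    executor       = 'slurm'
    queue          = { task.time < 2.h ? 'htc' : 'public' }
    clusterOptions = { task.time < 2.h ? '' : '--qos=public' }
}
```

Because the closures are evaluated per task, each process lands on the right partition from its own time request, with no per-process queue overrides needed.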
Hey, I think that is a better idea and it helps a lot with the review. Actually, all these changes were planned on my side but kept getting pushed back. Thanks for all the effort! One side note: let's make sure the results are reproducible between the current version and the new one. Best,
@alejandrogzi Thanks! I have broken the changes up into three new PRs and closed this one, so we can divide and conquer : ) I'm also happy to meet with you to go over the details. Please feel free to send me an email: nilmu@asu.edu
Hello folks,
Sorry for this giant pull request. I tried refactoring the entire code base to follow the latest nf-core standards/formats, and replaced twobitreader with py2bit, as the latter supports 64-bit input. Please check CHANGES_nfcore_refactor.md for all the changes. I'll test this new codebase on my end with Simone's data and let you know how it goes. And I'm happy to help with uploading the pipeline to nf-core, and the Dockerfile to Docker Hub, once we are sure everything is working as intended.
Regards,
Nil