So this is Xmas, and what have you done?

I joined BioHackathon 2018, held in Matsue. I also joined the Common Workflow Language (CWL) Workshop Tokyo 2018, held in Shibuya prior to the hackathon, and learned a bit about CWL.

I had other tasks to discuss at the hackathon (improving the searchability of All Of gene Expression (AOE)), but it was also a very good chance to implement with CWL the data analysis pipeline that builds the index for AOE. I asked Ishii-san whether a workflow that does not use Docker could be CWLized, and he said 'It definitely works!'. So I coded it in parallel with my other tasks.

I wrote my CWL files following the excellent reference material (in Japanese) presented at CWL Workshop Tokyo 2018.

For reference, the original command line I wanted to run was like this:

perl \
| pigz -c > xRX.json.gz

First, I tried to write a CommandLineTool by adapting the sample code to my case. Because my script requires an additional file that specifies the IP address of the API server, I failed at first. With great help from Ishii-san, I finally got it working (perl-gethoge.cwl). The lesson: every file the tool uses must be declared in inputs:.
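As an illustration, here is a minimal sketch of a CommandLineTool that declares an extra file input alongside the main one, in the spirit of perl-gethoge.cwl. The input names (script, file, ipfile) and the stdout filename are my assumptions, not the actual contents of that file:

```yaml
#!/usr/bin/env cwl-runner
# Sketch only: input names and output filename are assumptions.
cwlVersion: v1.0
class: CommandLineTool
baseCommand: perl
inputs:
  script:
    type: File          # the Perl script itself
    inputBinding:
      position: 1
  file:
    type: File          # the main data file
    inputBinding:
      position: 2
  ipfile:
    type: File          # extra file holding the API server's IP address;
    inputBinding:       # declaring it here is what makes cwltool stage it
      position: 3
outputs:
  json:
    type: stdout
stdout: gethoge.json
```

The key point is that the IP-address file is an ordinary File entry in inputs:, so the runner copies it into the tool's working directory like any other input.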

Then, the other part of the workflow, a CommandLineTool named pigz.cwl, was written to compress the output of the first command.
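A compressor tool like that might look as follows. This is a sketch under assumed input/output names, not the actual pigz.cwl:

```yaml
#!/usr/bin/env cwl-runner
# Sketch only: input/output names are assumptions.
cwlVersion: v1.0
class: CommandLineTool
baseCommand: [pigz, -c]     # -c writes compressed data to stdout
inputs:
  input_json:
    type: File
    inputBinding:
      position: 1
outputs:
  compressed:
    type: stdout
# Parameter reference: name the output after the input, plus .gz
stdout: $(inputs.input_json.basename).gz
```

Capturing stdout with a stdout: shortcut is the usual CWL idiom for Unix filters like pigz that stream to standard output.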

Finally, the Workflow, named gethoge-and-pigz.cwl, was written by following the great reference described above. There was some trial and error in setting up the output files, but it was not as hard as the earlier issue (including the additional file).
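Wiring the two tools together could be sketched like this, assuming the input/output names from the sketches above (again, not the actual gethoge-and-pigz.cwl):

```yaml
#!/usr/bin/env cwl-runner
# Sketch only: step and port names are assumptions.
cwlVersion: v1.0
class: Workflow
inputs:
  file: File          # main data file
  ipfile: File        # file holding the API server's IP address
outputs:
  compressed_json:
    type: File
    outputSource: compress/compressed   # final .gz from the pigz step
steps:
  gethoge:
    run: perl-gethoge.cwl
    in:
      file: file
      ipfile: ipfile
    out: [json]
  compress:
    run: pigz.cwl
    in:
      input_json: gethoge/json          # pipe step 1's output into pigz
    out: [compressed]
```

The outputSource line is where the workflow-level output gets set; pointing it at the wrong step/port is an easy mistake, which may be the trial and error mentioned above.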

These CWL files were successfully pushed to GitHub Pages and visualized with the power of a public web service, where you can see our workflow as a diagram. And my workflow can now be run with the command below!

cwltool perl-gethoge.cwl --file  --ipfile IP.txt

Of course, IP.txt must be specified (it is not in the GitHub repository for now, though).

Thanks a lot to Ishii-san and the other hackers in the CWL community, especially Michael, who gave me a lot of chocolates both in Tokyo and Matsue!

Written by bonohu in DBCLS on Fri, 21 December 2018.