Set up the BOSS environment


Copy your DST files to HDFS

Usually this is done by the Hadoop admins; you can find your files under the /hdfs/ directory on the Hadoop nodes:

ls /hdfs/

If you want to copy files, please contact
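As a quick sanity check before submitting jobs, you can verify that your DST directory is actually visible on the node. This is a minimal sketch; the path in DST_DIR is only the example used later on this page, so substitute your own data set:

```shell
# Assumed example path -- replace with your own DST directory on HDFS.
DST_DIR=/hdfs/offline/data/662-1/jpsi/dst

if [ -d "$DST_DIR" ]; then
    # Show the first few DST files that the Hadoop job would pick up.
    ls "$DST_DIR" | head
else
    echo "DST directory not mounted; contact the Hadoop admins"
fi
```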

Running analysis jobs on the Hadoop cluster

1. Log in to one of the Hadoop nodes using your AFS account

If you have problems logging in to the Hadoop nodes, please contact us with your AFS account name.

2. Run the job

boss.hadoop jobOptions_ana_rhopi.hadoop.txt

Note: the Hadoop job will process all the DST files in $Hadoop.InputDir and write the resulting ROOT file to $Hadoop.OutputDir. The job option file is therefore the same as the one used on localhost or PBS, except for two parameters:

// Input REC or DST file name 
Hadoop.InputDir = "/hdfs/offline/data/662-1/jpsi/dst/";
//EventCnvSvc.digiRootInputFile = {"rhopi.dst"};

Hadoop.NtuplePath = "FILE1";
Hadoop.OutputDir = "/publicfs/zangds/662-1/jpsi/root/";
//NTupleSvc.Output = { "FILE1 DATAFILE='/tmp/run_0028375_All_file001n2.root' OPT='NEW' TYP='ROOT'"}; 
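Putting the pieces together, the snippet below sketches how you might generate such a job option file and submit it. The paths are the examples from this page, and the generated file name matches the command shown above; adjust both to your own data set:

```shell
# Write a minimal Hadoop job option file.  The Hadoop.* parameters replace
# the EventCnvSvc input file and NTupleSvc output settings used on
# localhost/PBS; all other options stay the same as in your usual job.
cat > jobOptions_ana_rhopi.hadoop.txt <<'EOF'
// Input REC or DST file name
Hadoop.InputDir = "/hdfs/offline/data/662-1/jpsi/dst/";

Hadoop.NtuplePath = "FILE1";
Hadoop.OutputDir = "/publicfs/zangds/662-1/jpsi/root/";
EOF

# Sanity check: the file should contain the three Hadoop.* lines.
grep -c '^Hadoop\.' jobOptions_ana_rhopi.hadoop.txt   # → 3

# Submit on a Hadoop login node:
# boss.hadoop jobOptions_ana_rhopi.hadoop.txt
```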
Topic revision: r3 - 2012-10-25 - Donal