Dec 11-12/25 PionLT/KaonLT Analysis Meeting Notes
-------------------------------------------------
(Notes by GH)

Today: PionLT will be discussed first

Thursday: Present
-----------------
Regina   - Garth Huber, Muhammad Junaid, Nathan Heinrich, Alicia Postuma,
           Vijay Kumar, Nermin Sadoun
Virginia - Richard Trotta
FIU      - Pete Markowitz
CUA      - Tanja Horn, Chi Kin Tam
Ohio     - Julie Roche
JLab     - Dave Gaskell

Junaid
------
PionLT Q2=2.85, W=2.02
- continuing preparations for LT-separation
- pi+ PID: proposing to use CoinTime and RFtime, no aerogel cut
  - showed low epsilon RF cut plots, where aerogel n=1.03
  - applied 1.2, 0.5 NPE cuts - negligible difference in MM plot
  - low epsilon efficiency with RF cut and aero>0.5 NPE: 0.99646 +/- 0.00024
  - high epsilon (where n=1.011) with RF cut and aero>0.5 NPE: 0.08557 +/- 0.00015
  - Tanja: what is the effect of a tighter aerogel cut?
- *NB* Alicia has a good suggestion: reverse the RF cut and look at the aerogel and
  MM histos to see where the background lies
- *NB* Nathan comments that the real issue in whether or not to include the aerogel
  cut is which version gives the smaller systematic uncertainties
  - including the cut could reduce the cut dependence on the other PID detectors
  - excluding the cut removes the contribution from the aerogel efficiency uncertainty
  - the question is which scenario gives lower errors overall
- diamond cut: adjusted limits a bit tighter
  - Garth: suggests a small adjustment to the LL corner, due to the location of the
    edge of the SIMC distribution there

Nathan
------
PionLT systematic uncertainties
- looked at Abishek's PhD thesis re. systematic uncertainties for tracking
  - the deviation from a straight-line fit of tracking efficiency vs rate gave the
    systematic (a sketch of this type of fit appears after this section)
  - Dave: this gives the point-to-point systematic only
- Nathan will apply this to the CoinLumi analysis; it should result in a slightly
  smaller systematic attributed to the EDTM livetime
- will revisit the correlated systematic part later, when assessing the
  LT-separation results
- Dave is satisfied with this approach

Working on Junaid's LT-sep code for Q2=3.85, W=2.62
- did his own replay, not using Junaid's replay, so this will independently confirm
  whether all calibrations, ME, offsets, etc. are applied correctly
- the idea is to double-check Junaid's work, but not change the t-phi binning,
  diamond, or other cuts, and see how close he comes to reproducing Junaid's results
- CoinBlocking and pion absorption corrections are applied
- shows plots of MM for Data, MC using the default physics_iterate.f model
- also shows sig_uns using the default model; the weight recalculation script has
  not been used yet
- currently running Junaid's LT-sep framework with his most recent iteration
  function and parameters, and will show new plots when that is done
- found some deficiencies in Junaid's instructions, which have been communicated
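To make the straight-line-fit procedure above concrete, here is a minimal python
sketch; the rates, efficiencies, and names are hypothetical stand-ins, not values
from the actual CoinLumi analysis:

  import numpy as np

  # Hypothetical tracking efficiencies measured at several trigger rates (kHz);
  # the real inputs would come from the luminosity-scan runs.
  rate = np.array([50., 100., 200., 300., 400., 500.])
  eff  = np.array([0.9935, 0.9921, 0.9898, 0.9870, 0.9851, 0.9823])

  # Straight-line fit of efficiency vs rate.
  slope, intercept = np.polyfit(rate, eff, 1)

  # The scatter of the points about the fit line is taken as the
  # point-to-point (uncorrelated) systematic, as Dave noted above.
  residuals = eff - (slope*rate + intercept)
  pt_to_pt_sys = np.std(residuals, ddof=2)   # ddof=2: two fitted parameters

  print(f"slope = {slope:.2e}/kHz, point-to-point sys = {pt_to_pt_sys:.1e}")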
Alicia
------
pi+n BSA paper: more submission problems
- PLB refuses to accept Mississippi State as a valid institution
- has tried contacting their staff to help
- if this can't be resolved, then we will be required to accept the default
  copyright agreement (though she can't see what that is while the institution
  error persists)

u-channel omega analysis
- running the PYTHIA generator for the Q2=3.0, W=2.32 setting
  - this is the one with the weird double-bump MM distribution
- spoke w/ Henry Klest, who suggested turning on rho production to see the difference
  - indeed, that made the extra MM bump bigger, so this feature appears to be due to
    the rho
- we can't use this version in the data fit, because the ratio of rho to PhaseSpace
  is hard-wired. It's better to have them separate, so they can be individually fit
  to the data for a better fit to the background underneath the omega MM peak
- now trying to turn off Diffractive rho (ID=9900110) and charged rho (ID=213)
  production, not just rho0 production
- another option would be to disable all resonance decays, to make it easier to
  identify rho events for exclusion
- new statistics table per setting, to guide binning decisions
  - the previous table had no diamond cut, which over-estimated the high epsilon
    counts
  - added a preliminary diamond cut and found the #events integrated over the u and
    phi range
  - Bill had 70 events/u-phi bin as a lower limit in his omega analysis; still
    evaluating what would be appropriate here
  - really low #events for Q2=5.5; at best would only be able to establish an upper
    bound on the cross section (which might still be useful)
  - Q2=2.1 and 4.4 have decent statistics
  - Q2=3.0, W=2.32 has good statistics but worse MM resolution
  - Q2=3.0, W=3.14 is best overall in terms of statistics and resolution
Next steps:
- will submit a new replay of all data over the holidays, due to the 0th order ME
  offset discussed last week
- need to determine the u-phi binning for all settings
  - Bill had 3 u-bins for the Q2=1.6, 2.2 data
  - expecting 3 u-bins for both Q2=3.0 settings and 2 u-bins for Q2=2.1, 4.4
- aim is to complete the full LT-separation of the first setting before
  Junaid/Nathan complete their theses

Chi Kin
-------
KaonLT Q2=3.0, W=3.14 background fit (underneath Lambda)
- steps are:
  1) shift data by ~1 MM bin and subtract the Lambda simulation from the data
  2) then fit the resulting flatter histo with a polynomial and subtract it
  3) an alternate method replaces the polynomial with a Chebyshev function
- the Chebyshev method gives a better description of the RadiativeTail region,
  while the previous method over-subtracted the tail region (see the sketch after
  this section)
- Data/MC yield ratios after this change
  - in the last t-bin, the ratios are closer to 1
  - the other t-bins are less changed
- Data vs MC plots
  - the comparison is not so good for HMS delta, xptar, yptar
Next steps:
- will replay data over the holidays, due to the 0th order ME offset, and also to
  determine the CoinBlocking correction
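A minimal sketch of the Chebyshev variant of this background fit, on toy histograms;
the binning, peak shape, and polynomial degree are illustrative assumptions, not the
values used in the actual analysis (the ~1 MM-bin shift of the data is also omitted
here for brevity):

  import numpy as np
  from numpy.polynomial import Chebyshev

  rng = np.random.default_rng(0)

  # Toy MM axis (GeV), Lambda peak shape from "simulation", and smooth background.
  mm = np.linspace(1.08, 1.18, 50)
  lam_mc = 1000.0*np.exp(-0.5*((mm - 1.1157)/0.004)**2)
  true_bkg = 200.0 + 1500.0*(mm - 1.08)
  data = rng.poisson(lam_mc + true_bkg).astype(float)   # toy "data"

  # Step 1: subtract the Lambda simulation from the data -> flatter residual.
  residual = data - lam_mc

  # Steps 2/3: fit the residual with a Chebyshev series (the alternate method);
  # per the notes, this describes the RadiativeTail region better than the
  # ordinary polynomial, which over-subtracted the tail.
  bkg_fit = Chebyshev.fit(mm, residual, deg=4)

  # Background-subtracted Lambda signal per bin.
  signal = data - bkg_fit(mm)
  print(f"estimated Lambda yield: {signal.sum():.0f}")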
Friday: Present
---------------
Regina   - Garth Huber, Alicia Postuma, Nathan Heinrich, Vijay Kumar, Muhammad Junaid
Virginia - Richard Trotta
CUA      - Chi Kin Tam, Sameer Jain
JMU      - Gabriel Niculescu, Ioana Niculescu
Glasgow  - Rachel Montgomery
FIU      - Pete Markowitz
CSULA    - Konrad Aniol
JLab     - Dave Gaskell

Richard
-------
KaonLT Q2=3.0, W=2.32 LT-sep
- investigating t-phi bins with anomalously low Data/MC ratios
  - the #event threshold per bin was set too high, which excluded some low #event
    bins that had a fairly clean Lambda peak; keeping them now
  - for the shown t-phi bin, the high epsilon Data/MC ratio went from ~0.4 -> ~0.8
- *NB* Garth: this is good, but the ratio error bar before the fix was very small,
  which makes no sense given the low #events
  - apparently the error bars were erroneously calculated including the data that
    were excluded from the ratio calculation
  - making checks to ensure the exp yield and exp yield error are calculated
    consistently
- summary of current status:
  - Richard analysis:
    - Q2=3.0, W=2.32 and Q2=4.4, W=2.74 LT-seps are in a fairly good state
    - Q2=5.5 has low statistics; the parameterization is based on the other settings
  - Chi Kin analysis:
    - Q2=3.0, W=3.14 LT-sep is in good shape
    - Q2=2.115, W=2.92: much work still remains
Next steps:
- refine the model, further fit optimization before the holidays
- full replay over the holidays, as mentioned yesterday

Sameer
------
KaonLT CoinTime Blocking correction
- loosened coin windows
- Q2=5.5, W=3.02: correction = 0.96
- Q2=3.0, W=2.32: low rate ~0.99, higher rate ~0.98, in 2 distinct steps
- Q2=2.115: ~0.97-0.98
- Nathan: everything looks reasonable
Next steps:
- still need to look at Q2=0.5 and the additional 10.6 GeV data
- need to finalize systematic uncertainties, following how Nathan is calculating them

Question for us: what is the SHMS_FPtime variable?
- found this in the hcana code, where RawCoinTime is calculated
- Dave: the time from each of the 4 planes is projected to the focal plane using the
  particle trajectory (from tracking) and the particle speed
  - the times from all 4 planes are then averaged together
  - this gives a higher precision measurement of the particle arrival time at the
    focal plane

Vijay
-----
Low Q2 PionLT LT-sep
- modified the TT parameterization for Q2=0.375 to give a better fit of sigTT vs t
  - New: TT = [p1/t^2*exp(p2*t) + p3/t]*sin^2(theta)
  - Old: the same, except p1/t^3 in the first term
- was not planning to apply this to Q2=0.425, as t^3 seemed fine there
- *NB* Garth: suggests trying this at Q2=0.425 to get an idea of how much the cross
  sections change
- starting to work on the draft manuscript, beginning with the experiment and data
  analysis parts
- *NB* need to add Alicia to the author list

Gabriel
-------
KaonLT: new method to determine kinematic offsets
- developed a completely new framework, trying to avoid some limitations inherent
  in HeepCheck data
- HeepCoin data limitations include:
  - very large correlations between the different offsets, leading to large
    uncertainties
  - HeepCoin settings are limited to nearly equal angle/momentum settings due to
    practical considerations, far from the momenta/angles where the physics data
    are taken
- new python framework, tightly integrated in the KaonLT analysis workflow
- to reduce the possible offset parameter-space, the first step is to constrain the
  HMS, SHMS angle offsets
  - the xptar, yptar distributions are shaped by two things:
    - the edges are determined by the collimator geometry, which is well known
    - the shape of the distribution is determined by the physics, which is more
      poorly known
  - the idea then is to use the edge information to constrain the angle offsets
- wrote a script that tries 60 different offsets from -6mr to +6mr in both x,y
  directions for both HMS, SHMS, and finds the one that best lines up the edges of
  the data with the MC distribution
  - defines a Cost function, and minimizes it to determine the offsets (a sketch of
    this kind of scan appears below)
  - does not use hcana for these 60 different offsets, as that would be too time
    consuming; rather, calculates xtar, ytar, xptar, yptar from scratch in the script
- the determined offsets are not final; a narrower range around these offsets is
  then used in the next step
- the obtained agreement between data and MC is pretty good
  - SIMC is slightly wider than the data in one case; has some ideas to investigate
    there
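A minimal sketch of an edge-alignment scan of this kind; the cost function (an
edge-weighted chi-square between data and shifted-MC histograms), the offset grid,
and all names here are illustrative assumptions, not taken from Gabriel's framework:

  import numpy as np

  def edge_cost(data_vals, mc_vals, offset, nbins=60):
      """Mismatch between data and offset-shifted MC histograms, with the
      collimator-defined edge regions weighted more strongly."""
      d, edges = np.histogram(data_vals, bins=nbins)
      m, _ = np.histogram(mc_vals + offset, bins=edges)
      centers = 0.5*(edges[:-1] + edges[1:])
      # crude edge emphasis: upweight the outermost bins (illustrative choice)
      w = 1.0 + 4.0*(np.abs(centers) > 0.7*np.abs(centers).max())
      return np.sum(w*(d - m)**2/np.maximum(d + m, 1))

  # Toy xptar samples (rad): flat acceptance with a 2 mr offset hidden in the data.
  rng = np.random.default_rng(1)
  data_xptar = rng.uniform(-0.030, 0.030, 100_000) + 0.002
  mc_xptar   = rng.uniform(-0.030, 0.030, 100_000)

  # Scan trial offsets over -6mr..+6mr and keep the one minimizing the cost.
  trials = np.linspace(-0.006, 0.006, 61)
  costs = [edge_cost(data_xptar, mc_xptar, t) for t in trials]
  print(f"best-fit xptar offset: {trials[int(np.argmin(costs))]*1000:.1f} mr")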
- Garth: can you do this with the prescaled Singles data, rather than Coin data?
  That would avoid any potential complication caused by the correlation between the
  2 spectrometers
  - yes in principle; that would be a better method if the statistics are good. The
    important thing here is that a Cost function has been defined, which was not
    possible previously
- the next step is to find the delta and beam energy offsets, while varying the
  angle offsets in a narrow range about the step-1 offsets
  - generates a random set of offsets and investigates the chisquare minimum
  - the constraint is to reproduce MMpi:neutron, MMk:lambda, MMk:sigma0
  - gets good agreement between data and MC peak positions
- shows a histogram of the acceptable offsets for each quantity, which is
  approximately Gaussian in shape
- found the yield change due to the #events passing the MM and other cuts to be
  +/-0.05%, taking this as a systematic uncertainty in the unseparated yield
- Garth: in addition, we would like to understand the uncertainty in the kinematic
  reconstruction when comparing bins at low and high epsilon
- can we interpret the width of the offset peak as the systematic uncertainty in
  each offset?
  - Richard: this would be an upper limit on the systematic, as correlations
    between the offsets would not be taken into account
- as reported earlier, using HallC:p rather than the beam energy value in
  standard.kinematics
  - found that tracking the wandering of the beam energy with time gave a slightly
    better description of the data compared to MC
  - Garth: please see the Nov 20 notes. The issue is that HallC:p is not corrected
    for the Arc Energy Measurement. Need to find the value of HallC:p at the time
    of the Arc Energy Measurement (AEM), and then correct all other values by the
    ratio Beam = (HallC:p_now)/(HallC:p_AEM) * (AEM value in GeV)
  - *NB* Gabriel will look into this
- *NB* Garth: it would be great to compare the offsets determined with this method
  to those determined from HeepCheck, to better understand the systematics
- *NB* please send your slides to Garth for posting on RedMine

Next Week Meetings
------------------
- Thurs: Dec 18 @ 16:00 Eastern/15:00 Regina
  - KaonLT will go first
- Fri: Dec 19 @ 11:00 Eastern/10:00 Regina
  - we will continue where we left off
  - one of the things to discuss is our meeting schedule for 2026. Please look at
    your calendar so we can decide whether we need to move the meeting time or not.