Ticket list — group: v4r0 Release

#64 · Toy generation with constraints should follow guidelines in arXiv:1210.7141
Type: defect | Owner: Thomas Latham | Status: new | Reporter: Thomas Latham
Created: 2016-03-14T14:47:38Z | Modified: 2016-09-09T17:32:55+01:00
Generating and fitting toy experiments where the PDF contains Gaussian constraints is a slightly tricky business. This is discussed in arXiv:1210.7141 (http://arxiv.org/abs/1210.7141) and we should try to follow the guidelines therein. The missing piece (with respect to what we currently do) seems to be generating a different value for the mean of the constraint for each toy experiment. Since the PDF construction can change between generation and fitting, it would make sense to do this in the setup of the fit, but that raises concerns about reproducibility. Needs some thought.

#62 · Allow correlated uncertainties on an efficiency histogram
Type: enhancement | Owner: Daniel Craik | Status: new | Reporter: Daniel Craik
Created: 2015-11-04T16:46:53Z | Modified: 2015-11-04T16:46:53Z
Add functionality to allow the bins of an efficiency histogram to be fluctuated with local correlations (based on a scale length).

#68 · Implement new Kpi S-wave description
Type: enhancement | Owner: (none) | Status: new | Reporter: Thomas Latham
Created: 2017-01-19T10:17:34Z | Modified: 2017-05-11T12:23:42+01:00
Antimo Palano and Mike Pennington have made a new description of the Kpi S-wave: https://arxiv.org/abs/1701.04881. We should implement it in the package as a new "resonance".

#74 · Extend LauArgusPdf to the generalised form where p is not fixed to 1/2
Type: enhancement | Owner: Thomas Latham | Status: new | Reporter: Thomas Latham
Created: 2017-09-26T18:08:49+01:00 | Modified: 2017-09-26T18:08:49+01:00
The form of the ARGUS function currently in Laura++ is the one where the power is fixed to 1/2. We can extend it to allow other values of p (and indeed to allow p to be floated). The complication is that the analytical integral only works for this special case, so we would have to check whether p = 1/2 and is fixed, and otherwise revert to numerical integration.

#48 · Further efficiency savings in integral code
Type: enhancement | Owner: Thomas Latham | Status: new | Reporter: Thomas Latham
Created: 2014-10-17T17:47:55+01:00 | Modified: 2016-09-13T11:17:32+01:00
As mentioned in comment:27:ticket:11, there are still some large savings that could be made in the recalculation of the integrals. These should be investigated and implemented if feasible.

#2 · Time-dependent fit model
Type: task | Owner: Thomas Latham | Status: assigned | Reporter: Mark Whitehead
Created: 2013-10-09T08:20:32+01:00 | Modified: 2014-10-10T11:42:17+01:00
Include a fit model for performing time-dependent Dalitz-plot analysis. Overhaul the existing time-dependent model (in the BaBar version of the code) to fix any bugs (check the SCF implementation) and apply the changes to speed it up and make the backgrounds less B-factory specific (as has been done in the other fit models). Add it to a branch along with the time PDF. Create a ticket for Rafael to make both more general (i.e. add hyperbolic terms, etc.). Once all is completed, merge into the trunk for the new version.

#12 · Create DP-dependence wrapper for PDFs
Type: enhancement | Owner: Thomas Latham | Status: new | Reporter: Thomas Latham
Created: 2013-12-02T15:10:42Z | Modified: 2014-10-10T11:43:32+01:00
At present we have a class for e.g. a Gaussian PDF and another for the DP-dependent version. Investigate the possibility of having a wrapper class that can modify any PDF to give it dependence on the DP (or indeed on any other variable).

#15 · Allow convolutions of PDFs
Type: enhancement | Owner: Rafael Silva Coutinho | Status: new | Reporter: Thomas Latham
Created: 2013-12-04T14:51:54Z | Modified: 2015-09-03T14:34:16+01:00
Create a PDF class that is the convolution of two others, e.g. an ARGUS function and a Gaussian resolution. Can probably use TVirtualFFT to perform the actual calculations.

#31 · Asymmetric errors on remaining histogram classes
Type: enhancement | Owner: Daniel Craik | Status: new | Reporter: Daniel Craik
Created: 2014-04-15T02:17:06+01:00 | Modified: 2015-11-04T13:08:10Z
#25 introduces asymmetric errors for the efficiency histogram classes. For completeness the other histogram classes should have this feature.

#38 · New "command" to fill a 2D histogram with the value of the fitted function
Type: enhancement | Owner: Thomas Latham | Status: new | Reporter: Thomas Latham
Created: 2014-06-09T17:00:50+01:00 | Modified: 2017-09-11T14:44:02+01:00
For calculating the 2D chisq one currently has to generate large numbers of toy experiments and histogram the generated square-DP distributions. It would be better to have a routine that calculates the appropriate PDF value at each bin centre and multiplies it by the appropriate Jacobian to fill the histogram. This could then be compared with the data distribution.

#47 · Improve handling of cut-off in LASS amplitude
Type: enhancement | Owner: Thomas Latham | Status: new | Reporter: Thomas Latham
Created: 2014-10-13T12:30:18+01:00 | Modified: 2015-09-16T13:57:55+01:00

#14 · Reorganise the data storage
Type: task | Owner: Thomas Latham | Status: new | Reporter: Thomas Latham
Created: 2013-12-02T16:18:08Z | Modified: 2015-09-03T14:33:21+01:00
At present the data is read in via a TTree in LauFitDataTree. Then, when all the PDFs are asked to cache information, they make their own local copies of their abscissas, as well as of the likelihood value or some intermediate calculated values. In fits with very large data samples this could cause memory problems. It would be better to have only one copy of the data in memory and to give the PDFs references to their abscissa values; they can still store their likelihood and/or intermediate values locally. In the case of extremely large statistics even this might break down, so think about what to do then. Perhaps just use the TTree (with tweaked buffering options)?

#13 · Read currently hard-coded values from data files
Type: enhancement | Owner: Thomas Latham | Status: new | Reporter: Thomas Latham
Created: 2013-12-02T15:15:18Z | Modified: 2015-09-03T14:31:58+01:00
The particle masses, resonance parameters, and lists of allowed daughters/parents are currently hard-coded within various classes/namespaces. It would be better to have these read in from data files.
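The per-toy fluctuation of the constraint mean proposed in ticket #64 can be sketched as follows. This is a minimal illustration, not Laura++ code: `run_one_toy`, the parameter names, and the seeding scheme are all assumptions. Drawing the means from a seeded generator in the fit setup is one way to address the reproducibility concern raised in the ticket.

```python
import random

def run_constrained_toys(n_toys, nominal_mean, constraint_width,
                         run_one_toy, seed=12345):
    """For each toy experiment, draw a fresh value for the mean of the
    Gaussian constraint from G(nominal_mean, constraint_width), then
    generate and fit that toy using the drawn mean.  A fixed seed makes
    the sequence of per-toy means reproducible."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_toys):
        toy_mean = rng.gauss(nominal_mean, constraint_width)
        results.append(run_one_toy(toy_mean))
    return results
```

Re-running with the same seed regenerates the identical sequence of constraint means, so individual toys can be reproduced even though each uses a different mean.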
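One possible realisation of the locally correlated bin fluctuations of ticket #62 is to smooth independent unit Gaussian draws with a kernel whose width is the scale length, then rescale by each bin's error. This is an illustrative technique chosen here, not the Laura++ implementation; the function and parameter names are hypothetical.

```python
import math
import random

def fluctuate_bins(values, errors, scale_length, seed=0):
    """Fluctuate histogram bins with local correlations: draw one unit
    Gaussian per bin, smooth the draws with a Gaussian kernel of width
    scale_length (in bin units), renormalise each smoothed draw back to
    unit variance, then scale by the bin error and add to the bin value."""
    rng = random.Random(seed)
    n = len(values)
    white = [rng.gauss(0.0, 1.0) for _ in range(n)]
    out = []
    for i in range(n):
        w = [math.exp(-0.5 * ((i - j) / scale_length) ** 2) for j in range(n)]
        norm = math.sqrt(sum(x * x for x in w))  # restores unit variance
        smooth = sum(wx * gx for wx, gx in zip(w, white)) / norm
        out.append(values[i] + errors[i] * smooth)
    return out
```

Nearby bins share most of their smoothed noise, so their fluctuations are strongly correlated; bins separated by much more than the scale length fluctuate almost independently.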
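The check described in ticket #74 can be sketched numerically. The generalised ARGUS shape is f(m) = m [1 - (m/m0)^2]^p exp{c [1 - (m/m0)^2]}; for p = 1/2 and c < 0 its integral over [0, m0] has a closed form via the lower incomplete gamma function (expressible with the error function), while for other p one integrates numerically. The sketch below is not the LauArgusPdf interface; the `p_fixed` flag stands in for the "p = 1/2 and is fixed" check.

```python
import math

def argus_shape(m, m0, c, p):
    """Generalised ARGUS: m * t**p * exp(c*t), with t = 1 - (m/m0)**2."""
    t = 1.0 - (m / m0) ** 2
    return m * t ** p * math.exp(c * t) if t > 0.0 else 0.0

def argus_integral(m0, c, p, p_fixed=True, n_steps=20000):
    """Integral of argus_shape over [0, m0].  The closed form (via the
    substitution t = 1 - (m/m0)**2, giving (m0**2/2) * x**-1.5 *
    gamma(3/2, x) with x = -c) is used only when p == 1/2 and p is
    fixed; otherwise fall back to trapezoidal numerical integration."""
    if p == 0.5 and p_fixed and c < 0.0:
        x = -c
        # lower incomplete gamma(3/2, x) in terms of the error function
        gamma_inc = (0.5 * math.sqrt(math.pi) * math.erf(math.sqrt(x))
                     - math.sqrt(x) * math.exp(-x))
        return 0.5 * m0 ** 2 * x ** -1.5 * gamma_inc
    h = m0 / n_steps
    edges = [argus_shape(i * h, m0, c, p) for i in range(n_steps + 1)]
    return h * (sum(edges) - 0.5 * (edges[0] + edges[-1]))
```

Comparing the two branches at p = 1/2 gives a direct cross-check that the numerical fallback agrees with the analytical result before trusting it at other values of p.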
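One possible shape for the wrapper of ticket #12: hold a base PDF and, at evaluation time, resolve any parameter supplied as a function of the DP position (or of any other variable). This is an illustrative Python sketch of the idea, not the Laura++ class design; all names are hypothetical.

```python
import math

class DPDependentPdf:
    """Wrap a base PDF so any of its parameters may vary with the
    Dalitz-plot position: parameters given as callables are evaluated
    at the DP point, while constants are passed through unchanged."""
    def __init__(self, base_pdf, **params):
        self.base_pdf = base_pdf
        self.params = params

    def __call__(self, x, dp_point):
        resolved = {name: (par(dp_point) if callable(par) else par)
                    for name, par in self.params.items()}
        return self.base_pdf(x, **resolved)

def gaussian(x, mean, sigma):
    """Plain Gaussian PDF, used here only to exercise the wrapper."""
    return (math.exp(-0.5 * ((x - mean) / sigma) ** 2)
            / (sigma * math.sqrt(2.0 * math.pi)))
```

For example, `DPDependentPdf(gaussian, mean=lambda dp: 5.28 + 0.01 * dp[0], sigma=0.02)` makes only the mean DP-dependent, which is the appeal of a wrapper over writing a dedicated DP-dependent class per PDF.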
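The convolution PDF of ticket #15 computes (f ⊗ r)(x) = ∫ f(y) r(x − y) dy. A direct Riemann-sum sketch is below; in practice TVirtualFFT (as the ticket suggests) would replace the O(n²) sum with FFTs for speed. Function names and the grid scheme are illustrative assumptions.

```python
import math

def convolve_pdfs(pdf, resolution, x_lo, x_hi, n_bins=400):
    """Evaluate (pdf (x) resolution) on a uniform grid by a direct
    Riemann sum over the same grid; returns the grid centres and the
    convolved, renormalised PDF values."""
    dx = (x_hi - x_lo) / n_bins
    xs = [x_lo + (i + 0.5) * dx for i in range(n_bins)]
    f = [pdf(x) for x in xs]
    conv = [sum(fy * resolution(x - y) for y, fy in zip(xs, f)) * dx
            for x in xs]
    norm = sum(conv) * dx  # renormalise to unit integral on the grid
    return xs, [c / norm for c in conv]

def gauss(x, mean, sigma):
    """Gaussian PDF, used here as a resolution function."""
    return (math.exp(-0.5 * ((x - mean) / sigma) ** 2)
            / (sigma * math.sqrt(2.0 * math.pi)))
```

A convenient cross-check: convolving two zero-mean Gaussians of widths sigma1 and sigma2 must give a Gaussian of variance sigma1² + sigma2².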
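The routine proposed in ticket #38 fills each bin with PDF(bin centre) × |Jacobian(bin centre)| × bin area, scaled to the observed yield, and then compares with the data. A generic sketch follows; the actual square-DP Jacobian is not reproduced here, and `pdf` and `jacobian` are assumed callables, not Laura++ API.

```python
def fill_expected_hist2d(pdf, jacobian, n_events, nx, ny,
                         x_range=(0.0, 1.0), y_range=(0.0, 1.0)):
    """Return an nx-by-ny table of expected yields: PDF value at the bin
    centre times |Jacobian| at the bin centre times the bin area, then
    renormalised so the table sums to n_events."""
    (x_lo, x_hi), (y_lo, y_hi) = x_range, y_range
    dx = (x_hi - x_lo) / nx
    dy = (y_hi - y_lo) / ny
    hist = [[0.0] * ny for _ in range(nx)]
    total = 0.0
    for i in range(nx):
        xc = x_lo + (i + 0.5) * dx
        for j in range(ny):
            yc = y_lo + (j + 0.5) * dy
            hist[i][j] = pdf(xc, yc) * abs(jacobian(xc, yc)) * dx * dy
            total += hist[i][j]
    return [[n_events * h / total for h in row] for row in hist]

def chisq_2d(data, expected):
    """Pearson chi-square between an observed table and the expected one."""
    return sum((d - e) ** 2 / e
               for drow, erow in zip(data, expected)
               for d, e in zip(drow, erow))
```

This avoids generating and histogramming large numbers of toy experiments: one pass over the bins produces the expected distribution directly.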