
Re: A proposal for setting a BCAL threshold

David,

Thanks much for taking the time to read through and respond -- I've  
put more responses below.

On Jun 7, 2007, at 11:17 PM, David Lawrence wrote:

>    It seems the method here is to use a DAQ limit of 5% on the BCAL  
> occupancy to imply the threshold we would need to set on the cell  
> energy. I'm guessing the DAQ limitation would have to come from  
> network bandwidth or some self-imposed disk space limitation. (The  
> modules themselves would still have to analyze every channel to  
> determine sparsification so that bottleneck is a function of  
> trigger rate, not BCAL rate). Your premise that the fADC works like  
> an integrating ADC and that the pedestal width is all due to dark  
> rate means the 7 photoelectrons you calculate is an upper limit  
> for a fixed occupancy and dark rate. Since presumably a real fADC  
> would be able to reject many of the dark-rate-only events, it would  
> fold in an acceptance probability function to your Poisson to give  
> the probability distribution of accepted dark rate events.  
> Integrating the tail of that distribution *in* from infinity until  
> you get a total probability of 5% would lead to fewer  
> photoelectrons. This, of course, assumes the dark rate really  
> dominates the BCAL rate compared to real hits. I guess in short, I  
> think your assumption about the fADC operation means the 1.5MeV  
> number should be considered an upper limit using this methodology.

I agree with this -- I think in the end the fADC can probably use  
some fancier processing when it analyzes its 100 ns buffer in order  
to suppress single-PE dark hits.
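
For concreteness, here's a quick standalone C++ toy of the tail  
integration David describes: walk the Poisson CDF up from zero until  
the remaining tail drops below the 5% occupancy target. The dark-  
count mean per integration gate (mu below) is just a placeholder,  
not a measured number -- plug in the real rate-times-gate product to  
check the 7 PE figure.

/* toy_pe_threshold.cc: smallest photoelectron threshold whose
   Poisson tail probability stays below a target occupancy.
   The dark-count mean mu is a placeholder. */
#include <cmath>
#include <cstdio>

int main() {
    const double mu        = 1.0;   /* assumed mean dark PEs per gate */
    const double occupancy = 0.05;  /* 5% DAQ occupancy target        */

    /* Accumulate P(N < n); stop once the tail P(N >= n) <= target. */
    double cdf  = 0.0;
    double term = std::exp(-mu);    /* P(N = 0) */
    int n = 0;
    while (1.0 - cdf > occupancy) {
        cdf  += term;
        ++n;
        term *= mu / n;             /* P(N = n) from P(N = n-1) */
    }
    std::printf("threshold: %d PE (tail prob %.4f)\n", n, 1.0 - cdf);
    return 0;
}

Folding in the acceptance function David mentions would just mean  
weighting each term by the probability that an n-PE dark pulse  
survives the fADC processing before accumulating the tail.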

>    Now, if I understood Richard's comments at the calorimetry  
> workshop, we can expect the sampling fluctuations to translate to  
> at least 10MeV for even the smallest showers we expect to  
> reconstruct (100MeV?). This was the original motivation for setting  
> the threshold here. If that is the case, then adding information  
> from these 2-3MeV cells cannot improve your energy resolution. They  
> can, however, affect the rates if we choose to set the BCAL  
> threshold lower.

That's precisely why I want to go through this exercise -- I want to  
see what happens if we start adding lower-energy cells with realistic  
sampling fluctuations.  Mihajlo's studies show that lowering the  
threshold improves the energy resolution considerably, but those  
were done with no sampling fluctuations introduced.  (It also  
increases shower fragmentation -- so the algorithm will need  
retuning.)
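
To put a rough number on the "cannot improve the resolution" point:  
even in the most pessimistic reading, where dropping a near-threshold  
cell counts as a full-size error, adding it in quadrature to a 10 MeV  
sampling floor barely moves things.  The 10 MeV is David's quoted  
floor; the 2.5 MeV cell energy is an illustrative placeholder.

/* Back-of-envelope: how much can a few-MeV cell matter when the
   sampling fluctuations are already ~10 MeV? */
#include <cmath>
#include <cstdio>

int main() {
    const double sigma_sampling = 10.0; /* MeV, floor for ~100 MeV showers */
    const double cell_energy    =  2.5; /* MeV, a cell near threshold      */

    /* Worst case: treat the dropped cell as a full-size error and
       add it in quadrature to the sampling term. */
    const double sigma_total = std::hypot(sigma_sampling, cell_energy);
    std::printf("sigma: %.1f MeV -> %.2f MeV (+%.1f%%)\n",
                sigma_sampling, sigma_total,
                100.0 * (sigma_total / sigma_sampling - 1.0));
    return 0;
}

That comes out to about a 3% change in sigma, which is David's point  
-- the low cells matter for the rates, not the resolution.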

>    OK, now that I've said all of that, I think the 1.5MeV would be  
> OK, but I would vote for setting the threshold down even lower to  
> something like 1MeV provided it doesn't blow up the data file too  
> much (which I don't think it will). This will allow you to play  
> with the threshold outside of hdgeant. We could even make plots of  
> the BCAL rate as a function of threshold in MeV which could be  
> useful when writing up more detailed specifications for the DAQ.

I should have been more specific here.  I plan to go pretty low in  
hdgeant provided I can get away with it in terms of disk size.  I  
need to look at the sampling-fluctuation parametrization.  In the  
BCALMCResponse routines I'll introduce sampling fluctuations and  
electronics thresholds (and eventually, probably, extra electronics  
noise).  This would then let me change these numbers easily for  
studies, and in principle, later on, in a run- and channel-dependent  
way once a database interface is available.
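
As a sketch of what that smear-then-cut step could look like -- and  
of the rate-vs-threshold numbers David asked about -- here's a  
standalone toy.  The falling spectrum, the sampling coefficient a,  
and the threshold range are all placeholders, not the real BCAL  
parametrization or the actual BCALMCResponse interface.

/* Toy of the smear-then-threshold step: smear each cell energy with
   an assumed sampling term sigma/E = a/sqrt(E[GeV]), then scan an
   electronics threshold over the smeared values and print the
   surviving fraction (input for a rate-vs-threshold plot). */
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

int main() {
    std::mt19937 rng(12345);

    /* Placeholder falling spectrum of cell energies, <E> = 5 MeV. */
    std::exponential_distribution<double> spectrum(1.0 / 0.005);

    const double a = 0.05;  /* assumed sampling coefficient */

    std::vector<double> e_meas(100000);
    for (double& e : e_meas) {
        const double e_true = spectrum(rng);                   /* GeV */
        const double sigma  = a * std::sqrt(e_true) + 1e-9;    /* keep > 0 */
        e = std::normal_distribution<double>(e_true, sigma)(rng);
    }

    /* Scan the threshold without re-generating -- the same idea as
       playing with the cut outside of hdgeant. */
    for (double thr = 0.5; thr <= 3.01; thr += 0.5) {          /* MeV */
        int pass = 0;
        for (double e : e_meas)
            if (e * 1000.0 > thr) ++pass;
        std::printf("threshold %.1f MeV: %5.1f%% of cells survive\n",
                    thr, 100.0 * pass / e_meas.size());
    }
    return 0;
}

Smearing once and then scanning the cut is exactly the kind of study  
that becomes cheap once the threshold lives in BCALMCResponse rather  
than in the hdgeant output itself.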

-Matt