Hi Steve,
The livetime used was 92%, which is the 8-year average capacity factor
(percent of maximum power output averaged over time, including
outages). For the last 4 years the plant has been performing at 95% or
better, so I guess you could say that I have assumed 3% deadtime from
veto cuts (0.95 x 0.97 = 0.9215, i.e. about 92%).

My concern with the spectral analysis is that the EXACT same
shapes are used to determine the "data" spectrum and the fit sample.
There is no systematic error in the shapes. This leads to a better
than possible result from the shape analyses. How much better is not
clear to me. The other major difference between the shape and counting
analyses is that the counting analysis does not use any background
spectral information to reject background. This is the main difference
between the counting and shape+counting analyses in the allowed delta
m^2 region. In other words, most of the sensitivity comes from the rate
alone, but the fact that the shape fits take advantage of the background
spectral information makes them appear more sensitive. In reality this
is not a restriction that we would place on the counting analysis, and
I'll look for ways to fix this in the sensitivity code.
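To make that concern concrete, here is a minimal toy sketch. The spectral shapes, bin counts, and the 10% uncertainty are made-up illustration values, not the actual sensitivity-code inputs; it just shows how profiling a bin-to-bin background shape uncertainty into a spectral chi^2 inflates the error on a fitted signal normalization, compared with a fit where the background shape is taken as exactly known:

```python
import numpy as np

# Toy spectral fit: observed spectrum = signal + background, and we fit a
# single signal normalization.  All shapes and uncertainties here are
# hypothetical illustration values, not the real sensitivity-code inputs.
nbins = 20
signal = np.full(nbins, 100.0)                 # flat signal shape (toy)
background = np.linspace(200.0, 50.0, nbins)   # falling background shape (toy)

def norm_error(shape_sigma):
    """1-sigma error on the fitted signal normalization when each background
    bin carries a fractional shape uncertainty shape_sigma.  For a chi^2
    linear in the normalization, profiling a Gaussian-constrained nuisance
    parameter per bin is equivalent to inflating the bin variance:
        var_i = n_i + (shape_sigma * b_i)^2,  sigma_s^-2 = sum_i s_i^2 / var_i
    """
    var = (signal + background) + (shape_sigma * background) ** 2
    return 1.0 / np.sqrt(np.sum(signal**2 / var))

err_fixed = norm_error(0.0)   # background shape assumed exactly known
err_sys = norm_error(0.10)    # 10% bin-to-bin shape systematic

print(f"normalization error, shape fixed exactly: {err_fixed:.4f}")
print(f"normalization error, 10% shape systematic: {err_sys:.4f}")
```

The fixed-shape fit always reports the smaller error, which is the sense in which the current shape results are better than possible.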
I think that your point about independent analyses is correct, but it is
important to keep in mind that the two methods are not comparable in
sensitivity (see my last comment). If sin^2 2 theta_13 is around 0.02
we will very quickly see a strong effect in the counting analysis, but
we will have to integrate for quite a while longer to see the shape effect.
On the other hand, if sin^2 2 theta_13 is very small, say 0.007, we may
never see a significant effect in the rate alone.
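For scale, the rate deficit the counting analysis looks for follows the two-flavor survival probability. In the sketch below the ~1 km baseline and ~4 MeV antineutrino energy are illustrative values I picked, not the actual experiment geometry:

```python
import math

# Two-flavor reactor anti-nu_e survival probability:
#   P_ee = 1 - sin^2(2 theta_13) * sin^2(1.27 * dm2[eV^2] * L[m] / E[MeV])
# L = 1000 m and E = 4 MeV are illustrative values, not the real geometry.
def rate_deficit(sin2_2theta13, dm2_ev2=0.0025, L_m=1000.0, E_MeV=4.0):
    phase = 1.27 * dm2_ev2 * L_m / E_MeV
    return sin2_2theta13 * math.sin(phase) ** 2

for s22t in (0.02, 0.005):
    print(f"sin^2 2theta_13 = {s22t}: rate deficit = {rate_deficit(s22t):.3%}")
```

With these illustrative numbers, mixing of 0.02 gives a deficit of roughly 1%, well above a 0.6% normalization error, while very small mixing gives a deficit of a few tenths of a percent, below the systematic floor of a pure rate measurement.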
Anyway, here are the 1 year and 5 year scenarios that you asked for...
1 year, 0.6% relative normalization error: (sense_0.6_1yr.ps)

                      Delta m^2 (eV^2)
                0.0015   0.0020   0.0025   0.0030
------------------------------------------------
Counting Only   0.0207   0.0147   0.0126   0.0123
Shape Only      0.0242   0.0205   0.0190   0.0157
Counting+Shape  0.0159   0.0121   0.0106   0.0099
5 years, 0.6% relative normalization error: (sense_0.6_5yr.ps)

                      Delta m^2 (eV^2)
                0.0015   0.0020   0.0025   0.0030
------------------------------------------------
Counting Only   0.0168   0.0119   0.0102   0.0100
Shape Only      0.0126   0.0108   0.0100   0.0078
Counting+Shape  0.0096   0.0076   0.0068   0.0060
1 year, 8% movable detectors: (sense_md_1yr.ps)

                      Delta m^2 (eV^2)
                0.0015   0.0020   0.0025   0.0030
------------------------------------------------
Counting Only   0.0173   0.0123   0.0105   0.0103
Shape Only      0.0246   0.0206   0.0190   0.0159
Counting+Shape  0.0143   0.0107   0.0093   0.0088
5 years, 8% movable detectors: (sense_md_5yr.ps)

                      Delta m^2 (eV^2)
                0.0015   0.0020   0.0025   0.0030
------------------------------------------------
Counting Only   0.0097   0.0069   0.0059   0.0058
Shape Only      0.0129   0.0110   0.0102   0.0080
Counting+Shape  0.0068   0.0050   0.0044   0.0042
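As a rough consistency check on the tables above (this is not how the sensitivity code actually combines the analyses, since the combined fit shares the same data and correlations), the counting-only and shape-only limits can be combined in naive inverse quadrature. At the Delta m^2 = 0.0025 eV^2 column this reproduces the tabulated Counting+Shape numbers to within roughly 20%:

```python
import math

# Naive quadrature combination of the tabulated 90% CL limits at
# dm2 = 0.0025 eV^2: 1/comb^2 ~ 1/counting^2 + 1/shape^2.
# Only a rough cross-check; the real combined fit shares data and
# correlations, so it does not have to agree exactly.
tables = {  # scenario: (counting, shape, tabulated counting+shape)
    "1 yr, 0.6% norm":    (0.0126, 0.0190, 0.0106),
    "5 yr, 0.6% norm":    (0.0102, 0.0100, 0.0068),
    "1 yr, movable det.": (0.0105, 0.0190, 0.0093),
    "5 yr, movable det.": (0.0059, 0.0102, 0.0044),
}
for label, (c, s, comb) in tables.items():
    naive = 1.0 / math.sqrt(1.0 / c**2 + 1.0 / s**2)
    print(f"{label}: naive quadrature {naive:.4f} vs tabulated {comb:.4f}")
```

The tabulated combined numbers sit at or below the naive quadrature values, consistent with the combined fit gaining a little from correlations.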
Enjoy,
Jon
Steve Biller wrote:
> Thanks Jon, those are very interesting numbers! What livetime did you assume?
>I think it'd be worth doing this for both a 1 year and 5 year run time.
>What background shape uncertainties in particular are you worried about
>with regard to the spectrum analysis?
>
> I'd also like to make the general point that I think it is a mistake
>to discuss the capabilities of a combined counting+shape analysis very much.
>The MUCH bigger deal is that we can look for this with 2 INDEPENDENT techniques
>which have comparable sensitivity and we should stress our belief that
>this is key. I very strongly believe this. This is the thing that fully
>justifies our fiducial volume and is one of the important features that
>sets us apart from a number of other experiments. As I said before, I
>think that redundancy is the thing that will sell this project.
>
>
> - Steve
>
>
>
>Jonathan Link wrote:
>
>
>>Hi All,
>>
>>Here are the new numbers for the sensitivity of the baseline
>>experimental setup. I'm also attaching ps files showing the sensitivity
>>as a function of delta m^2 and sin^2 2 theta_13. Sensitivities are shown
>>at the 90% CL.
>>
>>1) Assuming 0.6% relative normalization error (i.e. no sensitivity gain
>>from movable detectors). This is the official baseline scenario. The
>>corresponding ps file is sense_0.6.ps.
>>
>> Delta m^2 (eV^2)
>> 0.0015 0.0020 0.0025 0.0030
>>---------------------------------------------------------------
>>Counting Only 0.0175 0.0124 0.0106 0.0104
>>Shape Only 0.0155 0.0133 0.0124 0.0098
>>Counting+Shape 0.0112 0.0087 0.0078 0.0070
>>
>>2) Assuming cross calibration with movable detectors for 8% of the run
>>(0.26% relative normalization error). The corresponding ps file is
>>sense_md.ps.
>>
>> Delta m^2 (eV^2)
>> 0.0015 0.0020 0.0025 0.0030
>>---------------------------------------------------------------
>>Counting Only 0.0113 0.0080 0.0069 0.0068
>>Shape Only 0.0158 0.0136 0.0126 0.0101
>>Counting+Shape 0.0086 0.0064 0.0056 0.0053
>>
>>A few comments... The shape analyses still make optimistic assumptions
>>about our knowledge of the background spectral shapes. This does not
>>affect the Counting analysis. Therefore we should use the sensitivity
>>of the Counting analysis as the upper limit of sensitivity and the
>>Counting+Shape analysis as the lower limit on sensitivity. We know that
>>we can do better than the counting analysis, but the Counting+Shape
>>sensitivity is perhaps too optimistic.
>>
>>Enjoy,
>>Jon
>>
This archive was generated by hypermail 2.1.8 : Fri Sep 10 2004 - 03:28:24 CDT