
Re: file size limit exceeded



Blake,

The stan cluster at IU does not have this limitation on either local 
ext2/3 mounts or NFS mounts.  I need to look into this further and 
will try to reproduce the problem.  It may have to do with which C 
library entry points get called (open vs. open64, fopen vs. fopen64).  
I will get back to you today.

-Richard Jones

Blake Leverington wrote:
> Thanks Richard,
>
> The issue I ran into is directly related to HDDM files; I do not 
> produce hbook or any other output.  I suspected a 2 GB limit because 
> the log files from the jobs I ran end with "Filesize limit exceeded", 
> and the HDDM file size is slightly larger than 2 GB.  This happened 
> for both bggen and hdgeant.  I had thought, as you said, that HDDM 
> was unlimited, so now I suspect it is the NFS filesystem I am running 
> on (stan@indiana).  I guess I will just have to split up my jobs more.
>
> Cheers,
> -Blake
>
> Richard Jones wrote:
>   
>> Hello all,
>>
>> There are no file size limits in HDDM, apart from those imposed by 
>> the OS and filesystem on which the file is being written.  I know 
>> there is no 2 GB cutoff because I routinely exceed that limit.  HDDM 
>> streams are sequential access by design, so their length is 
>> unlimited.  Some people, myself included, like to write out ntuples 
>> as a side-effect of the simulation, either to monitor conditions in 
>> the simulation or to do special simulation studies.  If you are 
>> looking at the size of the hbook file produced by the simulation, 
>> note that hbook has a hard 2 GB filesize limit.  Is that what you 
>> are running into?
>>
>> -Richard Jones
>>
>>
>>     
