2GB Limit?

Support for HandBrake on Linux, Solaris, and other Unix-like platforms
Forum rules
An Activity Log is required for support requests. Please read How-to get an activity log? for details on how and why this should be provided.
keller
Posts: 11
Joined: Thu Nov 08, 2007 4:04 pm

2GB Limit?

Post by keller »

I'm having an issue with HandBrakeCLI in which the program crashes after outputting 2 GB. I don't believe this is a filesystem issue, since the filesystem is XFS (which supports files far larger than 2 GB).

Here's my HandBrakeCLI string:

Code: Select all

HandBrakeCLI -i /dev/cdrom -L -T -e x264 -2 -q .6 -U -F -E ac3 -a 1 -6 6ch -o "MovieName (2000).mkv"
And here's the result of 'mount':

Code: Select all

/dev/hda2 on / type ext3 (rw,errors=remount-ro)
proc on /proc type proc (rw,noexec,nosuid,nodev)
/sys on /sys type sysfs (rw,noexec,nosuid,nodev)
varrun on /var/run type tmpfs (rw,noexec,nosuid,nodev,mode=0755)
varlock on /var/lock type tmpfs (rw,noexec,nosuid,nodev,mode=1777)
udev on /dev type tmpfs (rw,mode=0755)
devshm on /dev/shm type tmpfs (rw)
devpts on /dev/pts type devpts (rw,gid=5,mode=620)
lrm on /lib/modules/2.6.22-14-generic/volatile type tmpfs (rw)
/dev/hda3 on /home type xfs (rw)
/dev/hda1 on /media/xp type fuseblk (rw,nosuid,nodev,noatime,allow_other,default_permissions,blksize=4096)
securityfs on /sys/kernel/security type securityfs (rw)
binfmt_misc on /proc/sys/fs/binfmt_misc type binfmt_misc (rw,noexec,nosuid,nodev)
/dev/hdb on /media/cdrom0 type udf (ro,noexec,nosuid,nodev,user=keller)
Files are being written to my home directory, so I know they're going to an XFS partition.

I've compiled HandBrakeCLI from svn to be sure that this isn't a bug that's been addressed. So, this is using the absolute newest HandBrake.

Any ideas on what else might be causing any file over 2 GB to fail? Thanks for your help!

-Keller
jbrjake
Veteran User
Posts: 4805
Joined: Wed Dec 13, 2006 1:38 am

Post by jbrjake »

First off, -L is only for mp4. It doesn't affect mkv.

Second, you're using AC3. So why are you trying to do a 6-channel discrete AC3 downmix? The flag will just be ignored.

Third, I would never ever suggest using constant quantizer with x264. Use CRF. It's better. If you did that and applied some sensible x264 opts, like adding b-frames, you probably wouldn't even be outputting over 2 gigs.

Saintdev will have to chime in with any libmkv-specific size limits.
keller
Posts: 11
Joined: Thu Nov 08, 2007 4:04 pm

Post by keller »

Thanks for your response:

I thought that -L was to detect the longest title on the DVD. It doesn't work for mkv?

The -6 6ch is mostly leftover from all the messing around I did with my string. It worked, so I left it in. I'll remove it now.

What kind of x264 options might you suggest besides b-frames? I'm afraid I'm not familiar with custom options for x264. Any hints you could give would be well appreciated.

I'm also going to try the rip on another machine running an older version of Ubuntu. We'll see if that fixes my error.

Thanks again!

-Keller
jbrjake
Veteran User
Posts: 4805
Joined: Wed Dec 13, 2006 1:38 am

Post by jbrjake »

keller wrote:I thought that -L was to detect the longest title on the DVD. It doesn't work for mkv?
My bad. I confused --longest with --large-file. Yeah, -L works with mkv.
What kind of x264 options might you suggest besides b-frames? I'm afraid I'm not familiar with custom options for x264. Any hints you could give would be well appreciated.
bframes=6 as well as the denoise filter on "weak" should cut down on bitrate when using constant quality. Also apply the -Q flag to use crf instead of cqp.
Polygon
Novice
Posts: 72
Joined: Wed Oct 24, 2007 1:36 pm

Post by Polygon »

I'm also having this problem. Is it a limitation of x264 that the output can't be over 2 GB or something? The string I've used has worked fine for all the movies I've ripped so far; it's just that I'm trying to rip The Lord of the Rings, and since it's about 3 hours long I guess it's going over 2 GB.

Anyway, here is my string. I don't know much about ripping DVDs, so yeah:
mark@Cactus-Fantastico:~$ HandBrakeCLI -i "/media/cdrom0/VIDEO_TS" -o "/media/LinuxBackup/Ripped DVDs/The_Lord_of_the_Rings_1_-_The_Fellowship_of_the_Ring_x264.mkv" -e x264 --crop 0:0:0:0 -b 1500 -2 -B 160 -R 48
And it just exits out with:
Encoding: task 2 of 2, 84.05 % (17.34 fps, avg 18.52 fps, ETA 00h36m47s)File size limit exceeded (core dumped)
jbrjake
Veteran User
Posts: 4805
Joined: Wed Dec 13, 2006 1:38 am

Post by jbrjake »

As far as I know, "file size limit exceeded" is not a HandBrake error, but rather something your OS is giving you. Check if your user account has a max file size limit or something.
Polygon
Novice
Posts: 72
Joined: Wed Oct 24, 2007 1:36 pm

Post by Polygon »

That's the thing: I'm ripping to an ext3 filesystem, where the max file size is 2 terabytes. Even FAT32 allows 2 GB files!

Not to mention I'm looking at an 18.6 GB file on the same drive I was ripping the movie to, so I don't have a user restriction on how big files can be.

I'm like the original poster: we are both ripping to filesystems that allow files well over 2 GB, and it's still giving this error. Are you SURE it's not a bug?
jbrjake
Veteran User
Posts: 4805
Joined: Wed Dec 13, 2006 1:38 am

Post by jbrjake »

Polygon wrote:Are you SURE it's not a bug?
*blink*
You're a Linux user. If you don't believe me, svn co the source code, download all the contrib libraries, and grep them yourself. I guarantee you will not find the string "File size limit exceeded" anywhere in there. Ergo, not a HandBrake bug. It's system level.

There are hundreds of Google hits for this exact message, from all sorts of different applications, and in every case the answer is the same: the message is *not* from the app or a bug in the app; rather, the user has a file size limit set.

Your chosen file system is immaterial. You need to RTFM on ulimit or seek help from your distro's first line support.
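[Editor's note: the message in question can be reproduced deliberately, which shows it really is system-level. This is a quick sketch, not tied to HandBrake at all; the /tmp path is just for the demo, and `ulimit -f` units are 1024-byte blocks in bash (512-byte blocks in plain sh). Lowering the per-process file size limit in a subshell makes dd die with exactly this error:]

```shell
# Reproduce "File size limit exceeded": RLIMIT_FSIZE (ulimit -f) caps how
# large a file any child process may write; exceeding it raises SIGXFSZ,
# and the shell reports the killed process with exactly that message.
out=/tmp/fsize_demo.bin            # scratch file, demo only
status=0
(
  ulimit -f 64                     # cap new files (64 x 1024-byte blocks in bash)
  dd if=/dev/zero of="$out" bs=4096 count=256 2>/dev/null  # tries to write 1 MiB
) || status=$?                     # dd is killed by SIGXFSZ partway through
size=$(stat -c %s "$out")          # the file stops at the cap, not at 1 MiB
echo "dd exit status: $status, bytes written: $size"
rm -f "$out"
```

[If `ulimit -f` reports `unlimited` for the same user that runs HandBrakeCLI, as the posters below confirm, then this particular limit is not the culprit.]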
Polygon
Novice
Posts: 72
Joined: Wed Oct 24, 2007 1:36 pm

Post by Polygon »

The funny thing is, ulimit has no man page. xD

Anyway, I'm trying to figure this out, but ulimit -a tells me that file size is unlimited:
mark@Cactus-Fantastico:~$ ulimit -a
core file size (blocks, -c) unlimited
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 8191
max locked memory (kbytes, -l) 32
max memory size (kbytes, -m) unlimited
open files (-n) 1024
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) 8191
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
mark@Cactus-Fantastico:~$

And I'm searching via Google, and all the hits are essentially people trying to copy to a filesystem that has its own size limit (e.g. FAT32's 4 GB cap).

I shall continue searching for a solution.
jbrjake
Veteran User
Posts: 4805
Joined: Wed Dec 13, 2006 1:38 am

Post by jbrjake »

Polygon wrote:The funny thing is, ulimit has no man page. xD
Sure it does. If you don't have one, again, complain to your distro.
keller
Posts: 11
Joined: Thu Nov 08, 2007 4:04 pm

Post by keller »

Well, I've done some additional research on this.

I don't think it's caused by HandBrake, since there are obviously other people using it without issues.

I'm compiling from SVN and still getting the issue.

It seems to be a problem specific to Ubuntu. Most people having the issue are dealing with filesystems that have a cap; in some cases, Samba is the issue. I've tested XFS and JFS locally and still have the problem.

I think that it might be a kernel issue. I read somewhere that someone got it fixed by installing a 686 kernel instead of the 386 one. I may try recompiling my kernel to see if this is the issue.

Any other ideas or solutions anyone finds, please let me know!
keller
Posts: 11
Joined: Thu Nov 08, 2007 4:04 pm

Post by keller »

Interesting. This would seem to indicate that it's NOT a kernel issue (I think):

Code: Select all

keller@KellerDesktop:~/dvdtemp$ dd if=/dev/zero of=file_3GB bs=1000000 count=3k
3072+0 records in
3072+0 records out
3072000000 bytes (3.1 GB) copied, 87.6755 seconds, 35.0 MB/s
keller@KellerDesktop:~/dvdtemp$ ls -all
total 6604772
drwxr-xr-x  2 keller keller        147 2007-11-22 23:45 .
drwxr-xr-x 66 keller keller       4096 2007-11-22 23:27 ..
-rw-r--r--  1 keller keller  526725509 2007-11-22 01:33 Cry Freedom (1987).mkv
-rw-r--r--  1 keller keller 3072000000 2007-11-22 23:48 file_3GB
-rw-r--r--  1 keller keller  499379310 2007-11-21 17:16 Grumpier Old Men (1995).mkv
-rw-r--r--  1 keller keller  517685441 2007-11-21 19:34 Man Who Knew Too Much, The (1934).mkv
-rw-r--r--  1 root   root   2147483647 2007-11-22 11:57 SizeTest.mkv
keller@KellerDesktop:~/dvdtemp$ 
Could it be that one of the libraries being compiled isn't being built with LFS (large file support)? Would using the Linux binary make any difference?

-Keller
keller
Posts: 11
Joined: Thu Nov 08, 2007 4:04 pm

Post by keller »

Precompiled binaries downloaded from this site do the same thing.

Code: Select all

Applying the following x264 options: bframes=6:denoise=weak
x264 [info]: using cpu capabilities: MMX MMXEXT SSE SSE2 3DNow! 
No accelerated IMDCT transform found
Encoding: task 1 of 1, 35.07 % (32.49 fps, avg 37.35 fps, ETA 00h47m47s)File size limit exceeded (core dumped)
keller@KellerDesktop:~/dvdtemp$ 
String used is:

Code: Select all

HandBrakeCLI -i /dev/cdrom -L -T -e x264 -q .95 -Q -E ac3 -a 1 -x bframes=6:denoise=weak -o "Size Test.mkv"
.95 quality used so that the test doesn't take as long.

Any ideas?
Polygon
Novice
Posts: 72
Joined: Wed Oct 24, 2007 1:36 pm

Post by Polygon »

I already stated that I have 19+ GB files floating around my hard drive, and I created a 3 GB TrueCrypt volume just the other day to see whether it would give the "file size too big" error; it didn't. It is ONLY HandBrake that gives this error. I posted my ulimit output up there as well, and it says that my file size limit is unlimited. What's going on? :?
jbrjake
Veteran User
Posts: 4805
Joined: Wed Dec 13, 2006 1:38 am

Post by jbrjake »

Yet again: this is not a HandBrake problem. It is something to do with your Linux distro, and, again, if the problem isn't fixed by tweaking ulimit, you need to talk to your distro's front line support.

How do I know this is not a HandBrake issue, or caused by a HandBrake dependency not having large file size enabled? Because I can create >4GB .mkv files with HandBrakeCLI in Darwin. I just tested it to confirm again. Works fine.

Also, keller, ":denoise=weak" is not a valid x264 option.
keller
Posts: 11
Joined: Thu Nov 08, 2007 4:04 pm

Post by keller »

jbrjake,

I'm not saying that it's a HandBrake problem, I'm just documenting my findings so that other people having this issue can see what I've already tried. If it's a problem to post about HandBrake-related issues caused by factors other than HandBrake, please let me know.

-Keller
Polygon
Novice
Posts: 72
Joined: Wed Oct 24, 2007 1:36 pm

Post by Polygon »

Well, since it's not a HandBrake problem, what is HandBrake doing to the system that every other program that generates large files isn't?

I'm trying to ask other people to see if they can help, but all of them think it's a filesystem issue (like 4 GB on FAT32), and when I tell them it's on ext3 they just shrug and say "I don't know."

I'll try a previous version of HandBrake and see if it has the same problem.
rhester
Veteran User
Posts: 2888
Joined: Tue Apr 18, 2006 10:24 pm

Post by rhester »

It's probably a ulimit issue on your box.

If you do a "dd if=/dev/zero of=~/testfile.bin bs=1024 count=2560000" as the same user ID you run HandBrake under, what is the resulting filesize, and is any error generated?

Rodney
Polygon
Novice
Posts: 72
Joined: Wed Oct 24, 2007 1:36 pm

Post by Polygon »

rhester wrote:It's probably a ulimit issue on your box.

If you do a "dd if=/dev/zero of=~/testfile.bin bs=1024 count=2560000" as the same user ID you run HandBrake under, what is the resulting filesize, and is any error generated?

Rodney
If you go to the first page, keller did that and it completed successfully.

But here is my output as well:
mark@Cactus-Fantastico:~/Desktop$ dd if=/dev/zero of=~/testfile.bin bs=1024 count=2560000
2560000+0 records in
2560000+0 records out
2621440000 bytes (2.6 GB) copied, 515.44 seconds, 5.1 MB/s
mark@Cactus-Fantastico:~/Desktop$
Same username I tried to rip that DVD with in HandBrake and everything.

And here is my ulimit -a output again:
mark@Cactus-Fantastico:~/Desktop$ ulimit -a
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 8191
max locked memory (kbytes, -l) 32
max memory size (kbytes, -m) unlimited
open files (-n) 1024
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) 8191
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
mark@Cactus-Fantastico:~/Desktop$
keller
Posts: 11
Joined: Thu Nov 08, 2007 4:04 pm

Post by keller »

Same thing for me. It's not the filesystem or ulimit restrictions; it's got to be something HandBrake does differently. Not necessarily in HandBrake's code, but maybe in one of its dependencies.

Code: Select all

keller@KellerDesktop:~/dvdtemp$ dd if=/dev/zero of=testfile.bin bs=1024 count=2560000
2560000+0 records in
2560000+0 records out
2621440000 bytes (2.6 GB) copied, 70.1007 seconds, 37.4 MB/s
keller@KellerDesktop:~/dvdtemp$ ulimit -a
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 16375
max locked memory       (kbytes, -l) 32
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 16375
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited
keller@KellerDesktop:~/dvdtemp$ 
rhester
Veteran User
Posts: 2888
Joined: Tue Apr 18, 2006 10:24 pm

Post by rhester »

I'm very confused about this one, because the error being thrown is most definitely an OS signal and not coming from HandBrake.

Can you both provide details about your operating system and environment (which version of which distribution, 32-bit or 64-bit, whether you are using the precompiled HandBrake 0.9.1 binary or you compiled your own, if the latter what version of gcc was used, and your kernel version per 'uname -a')?

Rodney
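[Editor's note: to make rhester's point concrete: "File size limit exceeded" is the C library's standard description string for SIGXFSZ, the signal the kernel sends when a process writes past its RLIMIT_FSIZE. Any process killed by that signal yields the same wording, no matter which program it is, which is why grepping HandBrake's source for the string finds nothing. A minimal sketch, assuming a Linux shell where SIGXFSZ is signal 25 (true on x86 and ARM):]

```shell
# A process killed by SIGXFSZ exits with status 128 + 25 = 153, and an
# interactive shell reports it as "File size limit exceeded". The string
# comes from the shell/libc side, never from the application itself.
status=0
sh -c 'kill -s XFSZ $$' || status=$?   # process sends SIGXFSZ to itself
echo "exit status: $status"
```

[The "(core dumped)" suffix in the outputs above just means the core file size limit was nonzero when the signal hit.]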
Polygon
Novice
Posts: 72
Joined: Wed Oct 24, 2007 1:36 pm

Post by Polygon »

I'm using:

Ubuntu Gutsy Gibbon (7.10)
32 bit os
kernel: Linux Cactus-Fantastico 2.6.22-14-generic #1 SMP Sun Oct 14 23:05:12 GMT 2007 i686 GNU/Linux

This is happening to me with the HandBrake I compiled myself.
gcc version: 4:4.1.2-9ubuntu2
keller
Posts: 11
Joined: Thu Nov 08, 2007 4:04 pm

Post by keller »

Looks like I have a similar setup.

Ubuntu 7.10 Gutsy Gibbon, running in 32-bit

Linux KellerDesktop 2.6.22-14-generic #1 SMP Sun Oct 14 23:05:12 GMT 2007 i686 GNU/Linux

I compiled my own HandBrake using SVN and jam.
gcc (GCC) 4.1.3 20070929 (prerelease) (Ubuntu 4.1.2-16ubuntu2)

This may very well be an issue confined to Gutsy. I'll fire up a Feisty VM and see if I can replicate this in a previous version of Ubuntu.

Thanks again in advance for any help you can provide! :lol:

-Keller
rhester
Veteran User
Posts: 2888
Joined: Tue Apr 18, 2006 10:24 pm

Post by rhester »

The other thing I'd be keen on knowing is whether either of you have the same problem with the precompiled 0.9.1 binary.

Rodney
keller
Posts: 11
Joined: Thu Nov 08, 2007 4:04 pm

Post by keller »

Per one of my previous posts, precompiled binaries downloaded from this site do the same thing.