
Thread: /var/log/Xorg.0.log - nvidia GLX error


  1. #1
    Member
    Join Date: Feb 2007
    Posts: 229

    /var/log/Xorg.0.log - nvidia GLX error

    Has anyone else been getting this?

    (**) NVIDIA(0): Enabling RENDER acceleration
    (EE) NVIDIA(0): Failed to initialize the GLX module; please check in your X
    (EE) NVIDIA(0): log file that the GLX module has been loaded in your X
    (EE) NVIDIA(0): server, and that the module is the NVIDIA GLX module. If
    (EE) NVIDIA(0): you continue to encounter problems, Please try
    (EE) NVIDIA(0): reinstalling the NVIDIA driver.
    (II) NVIDIA(0): NVIDIA GPU GeForce 9600M GT (G96) at PCI:1:0:0 (GPU-0)

    Not sure how much it's affecting my system, but it's annoying nonetheless.

    I'm running the backtrack-nvidia package (nvidia-driver etc.) and ran nvidia-xconfig to make my xorg.conf.

  2. #2
    Developer
    Join Date: Mar 2007
    Posts: 6,126

    OK, so to get direct rendering going on BT4 pre with nvidia, you need to add these options to your xorg.conf under each "Device" section:

    Section "Device"
    Identifier "Device0"
    Driver "nvidia"
    VendorName "NVIDIA Corporation"
    BoardName "GeForce 8800 GT"
    BusID "PCI:3:0:0"
    Screen 0
    Option "AddARGBGLXVisuals" "true"
    Option "Coolbits" "1"
    Option "RenderAccel" "true"
    EndSection

    Also add the Composite section at the end:

    Section "Extensions"
        Option         "Composite" "Enable"
    EndSection

    Then restart X.
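
    On BT4 pre that usually means dropping back to the console and starting X again by hand (a minimal sketch, assuming you run X via startx rather than a display manager):
    Code:
    # log out of the current X session, then from the console:
    startx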

    To check if it's working, just apt-get install mesa-utils.

    Then run glxinfo | grep rendering to see if it's working, like this...

    root@bt:~# glxinfo |grep rendering
    direct rendering: Yes

    Should be all squared away after that.

  3. #3
    monovitae
    Just burned his ISO
    Join Date: Jul 2008
    Posts: 24

    I have attempted to follow your instructions, purehate; however, after doing so and restarting X, glxinfo | grep rendering returns 'Error: glXCreateContext failed'.

    Here's my xorg.conf; if you get a moment, maybe you could point out the error of my ways?

    Code:
    # nvidia-xconfig: X configuration file generated by nvidia-xconfig
    # nvidia-xconfig:  version 1.0  (buildmeister@builder63)  Thu Apr 30 17:37:55 PDT 2009
    
    Section "ServerLayout"
        Identifier     "X.org Configured"
        Screen      0  "Screen0" 0 0
        InputDevice    "Mouse0" "CorePointer"
        InputDevice    "Keyboard0" "CoreKeyboard"
    EndSection
    
    Section "Files"
        ModulePath      "/usr/lib/xorg/modules"
        FontPath        "/usr/share/fonts/X11/misc"
        FontPath        "/usr/share/fonts/X11/cyrillic"
        FontPath        "/usr/share/fonts/X11/100dpi/:unscaled"
        FontPath        "/usr/share/fonts/X11/75dpi/:unscaled"
        FontPath        "/usr/share/fonts/X11/Type1"
        FontPath        "/usr/share/fonts/X11/100dpi"
        FontPath        "/usr/share/fonts/X11/75dpi"
        FontPath        "/var/lib/defoma/x-ttcidfont-conf.d/dirs/TrueType"
    EndSection
    
    Section "Module"
        Load           "extmod"
        Load           "xtrap"
        Load           "dbe"
        Load           "glx"
        Load           "record"
    EndSection
    
    Section "InputDevice"
        Identifier     "Keyboard0"
        Driver         "kbd"
    EndSection
    
    Section "InputDevice"
        Identifier     "Mouse0"
        Driver         "mouse"
        Option         "Protocol" "auto"
        Option         "Device" "/dev/input/mice"
        Option         "ZAxisMapping" "4 5 6 7"
    EndSection
    
    Section "Monitor"
        Identifier     "Monitor0"
        VendorName     "Monitor Vendor"
        ModelName      "Monitor Model"
    EndSection
    
    Section "Device"
    
            ### Available Driver options are:-
            ### Values: <i>: integer, <f>: float, <bool>: "True"/"False",
            ### <string>: "String", <freq>: "<f> Hz/kHz/MHz"
            ### [arg]: arg optional
            #Option     "SWcursor"               # [<bool>]
            #Option     "HWcursor"               # [<bool>]
            #Option     "NoAccel"                # [<bool>]
            #Option     "ShadowFB"               # [<bool>]
            #Option     "UseFBDev"               # [<bool>]
            #Option     "Rotate"                 # [<str>]
            #Option     "VideoKey"               # <i>
            #Option     "FlatPanel"              # [<bool>]
            #Option     "FPDither"               # [<bool>]
            #Option     "CrtcNumber"             # <i>
            #Option     "FPScale"                # [<bool>]
            #Option     "FPTweak"                # <i>
            #Option     "DualHead"               # [<bool>]
        Identifier     "Card0"
        Driver         "nvidia"
        VendorName     "nVidia Corporation"
        BoardName      "Unknown Board"
    EndSection
    
    Section "Screen"
        Identifier     "Screen0"
        Device         "Card0"
        Monitor        "Monitor0"
        Option "AddARGBGLXVisuals" "true"
        Option "Coolbits" "1"
        Option "RenderAccel" "true"
        SubSection     "Display"
            Viewport    0 0
        EndSubSection
        SubSection     "Display"
            Viewport    0 0
            Depth       4
        EndSubSection
        SubSection     "Display"
            Viewport    0 0
            Depth       8
        EndSubSection
        SubSection     "Display"
            Viewport    0 0
            Depth       15
        EndSubSection
        SubSection     "Display"
            Viewport    0 0
            Depth       16
        EndSubSection
        SubSection     "Display"
            Viewport    0 0
            Depth       24
        EndSubSection
    EndSection
    
    Section "Extensions"
        Option "Composite" "Enable"
    EndSection
    ~Monovitae~

  4. #4
    Developer
    Join Date: Mar 2007
    Posts: 6,126

    Well, you put it under the "Screen" section instead of "Device". I would get rid of all that other garbage and make it look like this. If that doesn't work, we will investigate further.

    Section "Device"
    Identifier "Card0"
    Driver "nvidia"
    VendorName "nVidia Corporation"
    BoardName "Unknown Board"
    Option "AddARGBGLXVisuals" "true"
    Option "Coolbits" "1"
    Option "RenderAccel" "true"
    EndSection

  5. #5
    monovitae
    Just burned his ISO
    Join Date: Jul 2008
    Posts: 24

    Thanks for the quick reply. I have modified /etc/X11/xorg.conf to look like this:
    Code:
    Section "Device"
        Identifier "Card0"
        Driver "nvidia"
        VendorName "nVidia Corporation"
        BoardName "Unknown Board"
        BusID "PCI:1:0:0"
        Screen 0
        Option "AddARGBGLXVisuals" "true"
        Option "Coolbits" "1"
        Option "RenderAccel" "true"
    EndSection
    
    Section "Extensions"
        Option "Composite" "Enable"
    EndSection
    This results in the same error when running glxinfo | grep rendering.

    Removing
    Code:
        BusID "PCI:1:0:0"
    and
    Code:
    Section "Extensions"
        Option "Composite" "Enable"
    EndSection
    making it look exactly like your post, also has no effect.

    Other new information I have come across that may or may not be useful: when stopping X, the console shows the following.
    Code:
    (EE) NVIDIA(0): Failed to initialize the GLX module; please check in your X log file that the GLX module has been loaded in your X server, and that the module is the NVIDIA GLX module. If you continue to encounter problems, Please try reinstalling the NVIDIA driver.
    Upon reinstalling the driver via
    Code:
    apt-get install --reinstall nvidia-driver
    starting X fails with an error to the effect that the number of screens doesn't match. This can be rectified, so you can actually start X, by manually modifying the xorg.conf to what I have listed above or by running the NVIDIA configuration utility.
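
    (The configuration utility here is presumably nvidia-xconfig, the same tool that generated the config above; roughly, the recovery sequence looks like this:)
    Code:
    # regenerate /etc/X11/xorg.conf (the old file is saved as xorg.conf.backup)
    nvidia-xconfig
    startx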

    I also poked around in /var/log/Xorg.0.log as the error suggested, and the only thing that seemed possibly relevant to me was
    Code:
    (II) LoadModule: "glx"
    (II) Loading /usr/lib/xorg/modules/extensions//libglx.so
    (II) Module glx: vendor="X.Org Foundation"
    It just seemed like a possible culprit because the error wanted it to use the NVIDIA GLX module, but I could definitely be barking up the wrong tree here.
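
    For what it's worth, with the NVIDIA driver's GLX in place that vendor line should apparently read vendor="NVIDIA Corporation" rather than "X.Org Foundation", so it's easy to re-check after each change (paths taken from the log above):
    Code:
    # which GLX module did X load on the last start?
    grep -A2 'LoadModule: "glx"' /var/log/Xorg.0.log
    # and what file is actually installed at that path?
    ls -l /usr/lib/xorg/modules/extensions/libglx.so*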

    Anyway, that's all I know for now. If there's any other info I can provide to help, such as the rest of my /var/log/Xorg.0.log or something, let me know; I just didn't want to make a giant post full of useless crap.

    Thanks again, not only for the rapid response on this issue but for all your BT4 effort in general. Aside from this little quirk, everything has been working wonderfully for me with pre-final, and this problem is not something I would describe as blocking by any means.
    ~Monovitae~

  6. #6
    monovitae
    Just burned his ISO
    Join Date: Jul 2008
    Posts: 24

    Some further investigation into my issue has revealed the following: if I remove nvidia-driver via apt-get remove, reinstall it, and then modify xorg.conf to read:

    Code:
    Section "Device"
            Identifier  "Card0"
            Driver      "nvidia"
            VendorName  "nVidia Corporation"
            BoardName   "Unknown Board"
            Option "AddARGBGLXVisuals" "true"
            Option "Coolbits" "1"
            Option "RenderAccel" "true"
    EndSection
    
    Section "Extensions"
            Option "Composite" "Enable"
    EndSection
    With this setup, glxinfo | grep rendering is successful and glxgears works wonderfully. Proceeding from there, installing cpyrit-cuda appears successful and rendering still functions; however, when running pyrit list_cores, the GPU is not seen and is not utilized in the benchmark. I tried giving
    Code:
    apt-get install --reinstall pyrit cpyrit-cuda
    a shot, but it didn't change the situation.

    The pivot point in this whole affair appears to be installing 'cuda-toolkit'. If that is installed, the situation flips: glxinfo | grep rendering returns Error: glXCreateContext failed. However, pyrit then functions as expected, identifying the GPU in list_cores and benchmarking properly.

    So it would seem that cuda-toolkit is overwriting something essential. I've tried performing an 'apt-get install --reinstall nvidia-driver' after installing cuda-toolkit, but it doesn't seem to change anything: pyrit works, rendering doesn't.
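
    If anyone wants to dig into the same thing, here is roughly how I would check what cuda-toolkit clobbers (a sketch assuming the standard library paths; the exact file names may differ):
    Code:
    # which libGL does the dynamic linker resolve?
    ldconfig -p | grep libGL
    # which libGL does glxinfo actually pull in?
    ldd /usr/bin/glxinfo | grep libGL
    # is the X server's GLX module still NVIDIA's?
    ls -l /usr/lib/xorg/modules/extensions/libglx.so*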
    ~Monovitae~
