
Thread: Pyrit Cuda Nvidia Optimus Power Drain

  1. #1
    Senior Member
    Join Date
    Jul 2011
    Posts
    236

    Default Pyrit Cuda Nvidia Optimus Power Drain

    Ladies/Gents,

    I've been wanting to write this post for some time now, but didn't really have the motivation until I read this thread here. This isn't meant to hijack that thread in any way, shape, or form; W1K3D, you did a nice, detailed job describing your issue and asking for guidance. It's just that I'm going to go pretty in depth here, and it would stray from the original intent of your thread. 'Nuff said.

    This will be a multi-part post intended to address the issues I have encountered with dual-GPU architectures, also known as Nvidia Optimus switching technology. You will see this mainly in laptops that pair an Intel GPU with an Nvidia GPU. Most of the information I found deals with laptops that have only one GPU, not this newer type of technology.

    If you are curious about the title I chose, it's there so that this topic is easily searchable; it covers every item I will discuss in this thread.

    On the flip side, this post really belongs under the How-To section; for whatever reason, though, I do not have the privileges to post there. Devs, please move this post if able.

    I found my info scattered all around the web, not in any central location; one site had half of it, another had the other half, and no single source had everything I sought. It took a great deal of searching to pull this gouge together. I will credit the authors where due.

    Rig:
    Alienware M11x R3 (64-bit) i5 with Intel GPU and Nvidia GT540 graphics card (using "Optimus" switching technology).
    Back|Track 5 R1 (GNOME) 64-bit

    Part I

    Installation of Nvidia GPU-accelerated tools
    Steps:
    1. Per Source 1, install the prerequisites: apt-get install libssl-dev scapy python-dev
      Note--> I do not have python-dev installed on my rig; pyrit still works just fine for me, aside from a few errors (non show-stoppers) in the output that I will post at a later time. Perhaps the python-dev .deb is the solution. Will advise when able.
    2. Per Source 2
      1. Download Nvidia drivers according to your CPU architecture:
        • 32 bit: wget http://developer.download.nvidia.com/compute/cuda/4_0_rc2/drivers/devdriver_4.0_linux_32_270.40.run
        • 64 bit: wget http://developer.download.nvidia.com/compute/cuda/4_0_rc2/drivers/devdriver_4.0_linux_64_270.40.run

        Note--> There are newer drivers available; however, I have not yet tested them. If you decide to use these, please comment on your results.
        • 32 bit: wget http://us.download.nvidia.com/XFree86/Linux-x86/295.20/NVIDIA-Linux-x86-295.20.run
        • 64 bit: wget http://us.download.nvidia.com/XFree86/Linux-x86_64/295.20/NVIDIA-Linux-x86_64-295.20.run

      2. Log out of the screen manager.
      3. Do:
        Code:
        modprobe -r nouveau
      4. Launch the Nvidia driver installer (for 64-bit, for example: sh ./devdriver_4.0_linux_64_270.40.run), making sure that you install the 32-bit compatibility libraries when asked.
        Note--> There is an option to have Nvidia update your xorg.conf file; I skip this option because I have not yet gotten the Nvidia driver to work with my screen manager. More to come on this later. If you choose to have Nvidia update the file and a subsequent startx attempt fails, simply delete or rename /etc/X11/xorg.conf and try again (it works).
      5. Re-log into your screen manager and download the CUDA toolkit, according to your CPU architecture:
        • 32 bit: wget http://www.nvidia.com/object/thankyou.html?url=/compute/cuda/4_0_rc2/toolkit/cudatoolkit_4.0.13_linux_32_ubuntu10.10.run
        • 64 bit: wget http://www.nvidia.com/object/thankyou.html?url=/compute/cuda/4_0_rc2/toolkit/cudatoolkit_4.0.13_linux_64_ubuntu10.10.run
      6. Install the CUDA toolkit to /opt (run the downloaded .run file and give /opt/cuda as the install path when prompted).
      7. Append the following lines to /root/.bashrc:
        Code:
        PATH=$PATH:/opt/cuda/bin
        LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/cuda/lib64:/opt/cuda/lib
        export PATH
        export LD_LIBRARY_PATH
        
        Note--> If you are rocking 32-bit, then use LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/cuda/lib
      8. Then run:
        Code:
        source /root/.bashrc
        ldconfig
      9. Verify via:
        Code:
        which nvcc
        nvcc -V
      10. Lastly do:
        Code:
        cd ~
        svn checkout http://pyrit.googlecode.com/svn/trunk/ pyrit
        cd pyrit/pyrit && python setup.py build && python setup.py install
        cd ../../
        cd pyrit/cpyrit_cuda && python setup.py build && python setup.py install
        cd ~
        rm -rf pyrit
        modprobe -r nouveau
        modprobe nvidia
        pyrit benchmark
      11. To use your newly installed GPU capabilities, you must run the following each time you reboot and want to run pyrit or any other GPU-accelerated tool (a small helper script is sketched after this list):
        Code:
        modprobe -r nouveau
        modprobe nvidia
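    If you'd rather not retype those commands, here is a minimal sketch of a helper script that wraps step 11; the file name enable-nvidia.sh is my own choice, not part of the original guide, and it must be run as root:
    Code:
    #!/bin/bash
    # enable-nvidia.sh -- run as root after every reboot, before using GPU-accelerated tools
    modprobe -r nouveau                                       # unload the open-source nouveau driver
    modprobe nvidia                                           # load the proprietary Nvidia kernel module
    lsmod | grep nvidia || echo "nvidia module not loaded"    # quick sanity check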

    Congratulations, you now have GPU-accelerated capabilities, even with Optimus technology!
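    To double-check that pyrit is really using the GPU and not just the CPU, you can ask it to list the cores it has detected; if the CUDA module installed correctly, the output should include a CUDA device entry alongside the CPU cores:
    Code:
    pyrit list_cores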

    In my next post I will do a detailed walk-through on how to dramatically lower your power consumption on Optimus style boxes.
    V/r,
    Snafu
    Pffbt.. "I made a discovery today. I found a computer. Wait a second, this is cool. It does what I want it to. If it makes a mistake, it's because I screwed it up. Not because it doesn't like me... Or feels threatened by me.. Or thinks I'm a smart ass.."

  2. #2
    Just burned his ISO
    Join Date
    Feb 2012
    Posts
    14

    Default Re: Pyrit Cuda Nvidia Optimus Power Drain

    snafu777,

    This greatly helped me install and configure my GPU. I took a screenshot of the benchmark and was wondering if the results I received are in fact the ones I should be looking for.

    Many Thanks,

    W1K3D

    (Attachment: Pyrit Benchmark.jpg)

  3. #3
    Just burned his ISO
    Join Date
    Mar 2012
    Posts
    1

    Default Re: Pyrit Cuda Nvidia Optimus Power Drain

    Hey,

    Thank you a lot for your wonderful How-To.

    I tried to do everything you wrote.
    Now, when I try to start the nVidia X Server Settings tool,
    it says:

    You do not appear to be using the NVIDIA X driver. Please edit your X configuration file (just run `nvidia-xconfig` as root), and restart the X server.

    Okay, as it wishes, I let it write a new xorg.conf, log out, and try again with startx.
    But now it tells me that no screens were found.
    If I delete that xorg.conf again, I can startx, but I still have no way to get a higher resolution.

    Any solution?

  4. #4
    Just burned his ISO
    Join Date
    Feb 2012
    Posts
    6

    Default Re: Pyrit Cuda Nvidia Optimus Power Drain

    After installing the Nvidia driver, it says that it couldn't load the Nvidia kernel module.
    Do I have to install CUDA even if it doesn't recognize the Nvidia screen?

  5. #5
    Senior Member
    Join Date
    Jul 2011
    Posts
    236

    Default Re: Pyrit Cuda Nvidia Optimus Power Drain

    W1K3D,

    That's a benchmark very close to my own. Nice job and congrats =)
    V/r,
    Snafu

  6. #6
    Senior Member
    Join Date
    Jul 2011
    Posts
    236

    Default Part II

    Ladies/Gents,

    This is the second part of the tutorial for tweaking your Back|Track up.... In this part we will be using a program called bbswitch, which lets us turn OFF the Nvidia GPU for a HUGE reduction in power consumption; a very important tweak for laptop users like myself who try not to nuke the battery when running on battery power. Results may vary by box. My consumption went from an average of 24 watts on battery power down to 11 watts; a drop of more than half! Without further ado, let's foo this up.
    Part II
    Installation of bbswitch
    Steps:
    1. Find your current power consumption with the power cable disconnected via the following:
      Code:
      cat /proc/acpi/battery/BAT1/state | grep rate
      Note--> The battery name depends on the make and model of the laptop; if BAT1 doesn't exist, try BAT0, etc.
    2. Install dkms, a must-have for a lot of things...
      Code:
      apt-get install dkms
    3. Code:
      wget https://launchpadlibrarian.net/90456284/bbswitch-dkms_0.4.1-1_all.deb
      dpkg -i bbswitch-dkms_0.4.1-1_all.deb
      rm bbswitch-dkms_0.4.1-1_all.deb
    4. bbswitch is now installed. Time to activate it. You could of course script the following (a minimal sketch of such a script appears after these steps), but I prefer doing it manually, so here we go. Remember, the following must be done after each reboot to re-enable bbswitch:
      Code:
      modprobe -r nouveau
      modprobe bbswitch
      tee /proc/acpi/bbswitch <<<OFF
    5. Check the battery consumption after a few seconds via:
      Code:
      cat /proc/acpi/battery/BAT1/state | grep rate
      Note--> If the card stays on when trying to disable it, you've probably forgotten to unload the nouveau driver. You can check this via:
      Code:
      dmesg | tail -1
      If that was the case, redo:
      Code:
      modprobe -r nouveau
      modprobe bbswitch
      tee /proc/acpi/bbswitch <<<OFF
    6. To turn the Nvidia GPU back on for running programs that can use GPU acceleration, simply do:
      Code:
      tee /proc/acpi/bbswitch <<<ON
    7. Congratulations, bbswitch is now installed and running!
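
    As mentioned in step 4, the on/off dance can be scripted. Here is a minimal sketch of such a toggle script; the file name gpu-toggle.sh is just an example, and the BAT1 path may need adjusting for your laptop (see the note in step 1):
    Code:
    #!/bin/bash
    # gpu-toggle.sh -- example only; usage (as root): ./gpu-toggle.sh on|off
    modprobe -r nouveau 2>/dev/null            # make sure nouveau is not holding the card
    modprobe bbswitch                          # load bbswitch if it is not loaded yet
    case "$1" in
      off) tee /proc/acpi/bbswitch <<<OFF ;;   # power the discrete GPU down
      on)  tee /proc/acpi/bbswitch <<<ON  ;;   # power it back up for pyrit and friends
      *)   echo "usage: $0 on|off"; exit 1 ;;
    esac
    sleep 3                                    # give ACPI a moment to settle
    cat /proc/acpi/battery/BAT1/state | grep rate   # adjust BAT1 to match your battery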


    I don't remember off the top of my head what I had envisioned for Part III of this thread. I'll sit on it for a bit and get back to it in a day or so.


    Cheers!
    V/r,
    Snafu

  7. #7
    Just burned his ISO
    Join Date
    Feb 2012
    Posts
    14

    Default Re: Pyrit Cuda Nvidia Optimus Power Drain

    snafu777,

    Today I realized that when I log out and it brings me back to the CLI screen where you type "startx", this error appears: (EE) Failed to initialize GLX extension (Compatible NVIDIA X driver not found). I just upgraded to the 3.2.6 kernel. Would upgrading to the new kernel cause me to have to reinstall the drivers? Upon boot I have three options: BT5 with the 3.2.6 kernel, BT5 with the 2.x.x kernel, and the Windows 7 loader. When I do a modprobe nvidia, it states that the module nvidia was not found. I am going to reinstall the drivers following your instructions in the 3.2.6 kernel.

  8. #8
    Just burned his ISO
    Join Date
    Feb 2012
    Posts
    14

    Default Re: Pyrit Cuda Nvidia Optimus Power Drain

    I have answered my own question within minutes of posting: the 3.2.6 kernel does not seem to work with the GPU. I ran a pyrit benchmark under the 3.2.6 kernel and it used the CPU; after rebooting into the 2.x.x kernel and running the pyrit benchmark, it utilized the GPU. Just giving a heads-up on this.

  9. #9
    Just burned his ISO
    Join Date
    Mar 2012
    Posts
    5

    Default Re: Pyrit Cuda Nvidia Optimus Power Drain

    Isn't it better to use Bumblebee?
    It does all the work by itself... and it includes bbswitch as well.

  10. #10
    Senior Member
    Join Date
    Jul 2011
    Posts
    236

    Default Re: Pyrit Cuda Nvidia Optimus Power Drain

    W1K3D,

    The instructions I posted work well for the new kernel; just point to the new drivers I posted about. I've tested them and they work diggity (a quick sketch of re-running the installer under the new kernel follows the links):
    32 bit: wget http://us.download.nvidia.com/XFree86/Linux-x86/295.20/NVIDIA-Linux-x86-295.20.run
    64 bit: wget http://us.download.nvidia.com/XFree86/Linux-x86_64/295.20/NVIDIA-Linux-x86_64-295.20.run
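    A minimal sketch of the rebuild, assuming the 64-bit 295.20 installer and that you have logged out of the screen manager first:
    Code:
    modprobe -r nouveau
    sh ./NVIDIA-Linux-x86_64-295.20.run   # the installer rebuilds the kernel module against the running kernel
    modprobe nvidia
    pyrit benchmark                       # should show the GPU core again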
    V/r,
    Snafu


