important stuff
Jun 29, 2012


If you found this, you probably have a ThinkPad with an NVIDIA Optimus card and know that NVIDIA doesn't support Optimus under Linux. There are a few workarounds for using an external monitor with Optimus under Linux; this is the method that works best for me. This setup lets you use an external monitor without restarting the X server and keeps compositing (desktop effects) working while using two monitors. You will also be able to update your system without breaking this setup.

This guide adapts Optimal Ubuntu Graphics Setup for Thinkpads by Sagar Karandikar for Arch Linux.
For background information (or to read about setting this up on Ubuntu/Debian), read that post.


  1. Follow the instructions in the Bumblebee article on the Arch Linux wiki and get a basic Bumblebee setup, including bbswitch, working. Ignore the part about multiple monitors. (Working means: one monitor, and bbswitch and optirun do what they are supposed to.)
  2. Configure your setup to use the nvidia driver, bbswitch and bumblebeed (and make sure it still works...).
    This may even work with the nouveau driver, but I didn't test that, and the configs I use here are set up for the nvidia driver.
  3. Make sure to either delete your /etc/X11/xorg.conf or remove anything related to the NVIDIA card and multiple monitors from it.
  4. Install xf86-video-intel-virtual-crtc and screenclone-git from the AUR.
  5. Download xorg.conf.nvidia and replace /etc/bumblebee/xorg.conf.nvidia with it (use sudo). YOU WILL HAVE TO DO THIS AGAIN IF YOU UPDATE THE BUMBLEBEE PACKAGE, because setting a custom config file does not seem to work at the moment :/
  6. Restart X (or reboot, just to be sure...).
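For orientation: Bumblebee's stock xorg.conf.nvidia tells the nvidia driver to drive no display at all (render offload only), while the file you download in step 5 makes the driver pretend a monitor is connected so X sets up a screen that screenclone can copy. The snippet below is only a hypothetical sketch from memory of what such a Device section can look like; the exact option names and values may differ, so use the linked file, not this:

Section "Device"
    Identifier  "Device1"
    Driver      "nvidia"
    Option      "NoLogo" "true"
    # stock Bumblebee would have: Option "UseDisplayDevice" "none"
    # for screenclone, fake a connected monitor instead:
    Option      "UseEDID" "false"
    Option      "ConnectedMonitor" "CRT-0"
EndSection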

Now you should be able to use the following script to activate your external monitor :)
I know the script is kinda hacky and there are better ways to achieve what it does, but it does the job for me. If anyone comes up with something a little more sophisticated (disper might be an option, or maybe turn this into a daemon), feel free to post it.

#!/bin/bash
#change values below to your defaults
x=1920
y=1080
if [ -n "$1" ]; then x=$1; fi
if [ -n "$2" ]; then y=$2; fi

#generate a modeline from x/y
modeline=`cvt $x $y | sed "1d" | sed 's/Modeline //'`
mode=`echo $modeline | sed 's/ .*//'`

#create the mode and ignore the xrandr error if the mode is already there
xrandr --newmode $modeline > /dev/null 2>&1
#add the mode to the virtual display
xrandr --addmode VIRTUAL $mode
#activate external monitor and set options for it
xrandr --output LVDS1 --auto --output VIRTUAL --mode $mode --left-of LVDS1
#run screenclone in optirun. this way the NVIDIA card will automatically start and shut down. kill with ctrl+c
optirun screenclone -s $DISPLAY -d :8 -x 1
#deactivate monitor after screenclone is killed (kill this script with ctrl+c)
xrandr --output VIRTUAL --off
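To make the modeline plumbing in the script less magical: cvt prints a comment line followed by a Modeline line, the first sed drops the comment, the second strips the "Modeline " keyword, and the final sed keeps only the first word (the quoted mode name xrandr expects). A small self-contained demo using a canned sample in the shape of cvt's output (the timing numbers here are illustrative, not real cvt output):

```shell
#!/bin/bash
# Canned two-line sample shaped like the output of `cvt 1920 1080`
sample='# 1920x1080 59.96 Hz (CVT) hsync: 67.16 kHz; pclk: 173.00 MHz
Modeline "1920x1080_60.00"  173.00  1920 2048 2248 2576  1080 1083 1088 1120 -hsync +vsync'

# drop the comment line, then strip the leading "Modeline " keyword
modeline=$(echo "$sample" | sed "1d" | sed 's/Modeline //')
# keep only the first word: the quoted mode name
mode=$(echo $modeline | sed 's/ .*//')

echo "$mode"    # prints "1920x1080_60.00" (quotes included)
```

This is exactly the string the script later passes to `xrandr --addmode VIRTUAL` and `--mode`.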

When you run this script (run it from within your desktop environment), the NVIDIA card will activate and your second monitor should come up after a few seconds. The screen and the NVIDIA card stay active for as long as the script is running and shut down afterwards. You have to set the resolution for the external monitor yourself for this to work. For example, if you want a resolution of 1920x1080, you would call

./monitor 1920 1080

If you call it without parameters, it defaults to 1920x1080 with the external screen to the left of your notebook display, but you can easily change that in the script; everything that xrandr can do should work.
