Java on Linux: getDisplayMode() fails in a multi-screen setup

I have a desktop Java application (Java 1.4.2) that needs to determine information about two screens in a Linux environment:

# cat /etc/redhat-release
Red Hat Enterprise Linux WS release 4 (Nahant Update 7)
# lsb_release
LSB Version:    :core-3.0-ia32:core-3.0-noarch:graphics-3.0-ia32:graphics-3.0-noarch
# cat /proc/version
Linux version 2.6.9-78.ELsmp (brewbuilder@hs20-bc2-3.build.redhat.com) 
(gcc version 3.4.6 20060404 (Red Hat 3.4.6-10)) #1 SMP Wed Jul 9 15:39:47 EDT 2008

and the screens are 2048x2048 and 1600x1200.

The code is

GraphicsEnvironment env = GraphicsEnvironment.getLocalGraphicsEnvironment();
GraphicsDevice[] allScreens = env.getScreenDevices();
log("=============================================");
log("Total num. of screen = " + allScreens.length);
for (int i = 0; i < allScreens.length; i++) {
    log("--------------------------------------");

    // Size as reported by the screen's display mode
    log(
        allScreens[i].getIDstring() + " width: " + allScreens[i].getDisplayMode().getWidth() + 
        " - height: " + allScreens[i].getDisplayMode().getHeight());

    // Size and position as reported by the screen's default configuration
    GraphicsConfiguration dgc =
        allScreens[i].getDefaultConfiguration();
    Rectangle bounds = dgc.getBounds();
    Insets insets = Toolkit.getDefaultToolkit().getScreenInsets(dgc);
    log("Bounds: " + bounds);
    log("Insets: " + insets);

    log("--------------------------------------");
}
log("=============================================");
log("=============================================");

but the output is

=============================================
Total num. of screen = 2
--------------------------------------
:0.0 width: 2048 - height: 2048
Bounds: java.awt.Rectangle[x=0,y=0,width=2048,height=2048]
Insets: java.awt.Insets[top=0,left=0,bottom=0,right=0]
--------------------------------------
--------------------------------------
:0.1 width: 2048 - height: 2048
Bounds: java.awt.Rectangle[x=0,y=0,width=1600,height=1200]
Insets: java.awt.Insets[top=0,left=0,bottom=0,right=0]
--------------------------------------
=============================================

Screen :0.1 is reported as 2048x2048 by allScreens[i].getDisplayMode(), but as 1600x1200 by getDefaultConfiguration().getBounds().

Why do I get different results?

The JDK source for getDisplayMode() is

public DisplayMode getDisplayMode() {
    GraphicsConfiguration gc = getDefaultConfiguration();
    Rectangle r = gc.getBounds();
    ColorModel cm = gc.getColorModel();
    return new DisplayMode(r.width, r.height, cm.getPixelSize(), 0);
}

so the values should be the same. Why are they different?

Thanks

Answers


I noticed the same thing in my own application, which needs its GUI to fit different monitors in a multi-monitor environment. The problem appears to be related to the video card: with an Intel video card, allScreens[i].getDisplayMode() reports the same width and height for every monitor (essentially the primary's), whereas NVIDIA and ATI (AMD) cards return the actual resolution of each monitor from the same call.

So the reliable way to get the correct resolution of each monitor in a multi-monitor environment, regardless of video card, is to use getDefaultConfiguration().getBounds().width and .height, as in the sketch below.
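Here is a minimal, self-contained sketch of that approach. It uses the same java.awt API as the code in the question and sticks to Java 1.4 syntax; the class name ScreenSizes and the plain System.out output are just for illustration.

import java.awt.GraphicsConfiguration;
import java.awt.GraphicsDevice;
import java.awt.GraphicsEnvironment;
import java.awt.Rectangle;

public class ScreenSizes {
    public static void main(String[] args) {
        GraphicsEnvironment env = GraphicsEnvironment.getLocalGraphicsEnvironment();
        GraphicsDevice[] screens = env.getScreenDevices();
        for (int i = 0; i < screens.length; i++) {
            // Read each monitor's geometry from its default configuration;
            // getBounds() also gives the screen's position in the virtual desktop.
            GraphicsConfiguration gc = screens[i].getDefaultConfiguration();
            Rectangle bounds = gc.getBounds();
            System.out.println(screens[i].getIDstring() + ": "
                    + bounds.width + "x" + bounds.height
                    + " at (" + bounds.x + "," + bounds.y + ")");
        }
    }
}

Going by the Bounds lines in the log above, on the setup described in the question this should print 2048x2048 for :0.0 and 1600x1200 for :0.1.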

Hope it helps.

