Hello,
I am trying to change how touch input is handled on a Surface Pro device running Windows 7 in VMware.
What I am doing is running my WPF application as a transparent canvas, and whenever the user touches it, I simulate a mouse click in a target application.
I do that by finding the target window's name with Spy++, getting the window handle via "MainWindowHandle", and calling "SetForegroundWindow" to bring it to the front before I send the click with "SendMessage"/"mouse_event"; then I Activate the main window of my application to bring mine back on top.
It's basically an automated alt-tab, click, alt-tab back, where I just convert a touch input into a mouse click.
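In simplified form, the forwarding step looks roughly like this (a sketch rather than my exact code: the process name and coordinates are placeholders, the SetCursorPos call that positions the cursor at the touch point is an assumption, and real code needs more error handling):

    using System;
    using System.Diagnostics;
    using System.Linq;
    using System.Runtime.InteropServices;

    static class ClickForwarder
    {
        [DllImport("user32.dll")]
        static extern bool SetForegroundWindow(IntPtr hWnd);

        [DllImport("user32.dll")]
        static extern bool SetCursorPos(int x, int y);

        [DllImport("user32.dll")]
        static extern void mouse_event(uint flags, uint dx, uint dy, uint data, UIntPtr extraInfo);

        const uint MOUSEEVENTF_LEFTDOWN = 0x0002;
        const uint MOUSEEVENTF_LEFTUP = 0x0004;

        // Forward one touch point (in screen coordinates) as a left click
        // to the target process, then put our own window back on top.
        public static void ForwardClick(string processName, int x, int y, System.Windows.Window overlay)
        {
            var target = Process.GetProcessesByName(processName).FirstOrDefault();
            if (target == null || target.MainWindowHandle == IntPtr.Zero)
                return;

            SetForegroundWindow(target.MainWindowHandle); // the "alt-tab" to the target
            SetCursorPos(x, y);                           // move the cursor to the touch point
            mouse_event(MOUSEEVENTF_LEFTDOWN, 0, 0, 0, UIntPtr.Zero);
            mouse_event(MOUSEEVENTF_LEFTUP, 0, 0, 0, UIntPtr.Zero);

            overlay.Activate();                           // the "alt-tab" back to my canvas
        }
    }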
The problem is that while this works with many windows and applications, ranging from mspaint to VirtualBox, it does not work with VMware.
I'm fairly confident at this point that this is not a logic problem with how and when I bring the window into a position to receive the click, or with how I send the click (I have tried multiple methods).
So I assume some form of input blocking, or something else on VMware's side, is the culprit here.
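For concreteness, the "SendMessage" path (one of the methods I tried) boils down to this sketch, where the coordinates are client coordinates of the target window and hWnd comes from "MainWindowHandle" or a child window found with Spy++:

    using System;
    using System.Runtime.InteropServices;

    static class MessageClick
    {
        [DllImport("user32.dll")]
        static extern IntPtr SendMessage(IntPtr hWnd, uint msg, IntPtr wParam, IntPtr lParam);

        const uint WM_LBUTTONDOWN = 0x0201;
        const uint WM_LBUTTONUP = 0x0202;
        const int MK_LBUTTON = 0x0001;

        // Send a synthetic left click straight to the window procedure,
        // bypassing the cursor entirely. lParam packs the client coordinates.
        public static void Click(IntPtr hWnd, int clientX, int clientY)
        {
            IntPtr lParam = (IntPtr)((clientY << 16) | (clientX & 0xFFFF));
            SendMessage(hWnd, WM_LBUTTONDOWN, (IntPtr)MK_LBUTTON, lParam);
            SendMessage(hWnd, WM_LBUTTONUP, IntPtr.Zero, lParam);
        }
    }

If VMware grabs input below the window-message level, a synthetic click like this would never reach the guest at all, which would fit the input-blocking theory.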
Any suggestions on settings to change or code additions that would make this work?