
Incorrect touch point coordinates when running application on secondary screen with WM_Pointer enabled #8517

@lindexi

Description

I am reporting an issue where the GetStylusPoint method returns incorrect touch point coordinates when the application window is located on a non-primary screen of a multi-screen device and WM_Pointer is enabled.

This issue was initially discovered by my friend @kkwpsv. I borrowed a touch-enabled display and set it up as a secondary screen. I then monitored the StylusMove event and used the GetStylusPoint method to read the touch point whenever the event fired, and the coordinates I obtained were incorrect.
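
For reference, below is a minimal sketch of the kind of probing handler described above; it uses the GetStylusPoints overload on StylusEventArgs, and the element name RootGrid and the logging are illustrative only, not the exact demo code.

using System.Diagnostics;
using System.Windows.Input;

// Inside the window's code-behind; "RootGrid" is a hypothetical named element
// whose StylusMove event is subscribed in XAML.
private void RootGrid_StylusMove(object sender, StylusEventArgs e)
{
    // Points are requested relative to RootGrid. With WM_Pointer enabled and the
    // window on the secondary screen, these values come back offset.
    StylusPointCollection points = e.GetStylusPoints(RootGrid);
    foreach (StylusPoint point in points)
    {
        Debug.WriteLine($"StylusMove at ({point.X:0.##}, {point.Y:0.##})");
    }
}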

Concerned that it might be an implementation issue on my part, I switched to the default InkCanvas control and noticed that the strokes it drew were also offset. I recorded the screen as shown below. I have two screens: the left one is the primary screen and the right one is the secondary touch screen. I resized the window of the simplest demo application containing an InkCanvas control to span both screens, then drew lines on the secondary screen using touch and observed that the InkCanvas control drew the strokes at incorrect coordinates.

[Screen recording: touch offset]

Reproduction Steps

Steps to reproduce:

  1. Enable WM_Pointer messages by calling AppContext.SetSwitch("Switch.System.Windows.Input.Stylus.EnablePointerSupport", true)
  2. Add an InkCanvas control to MainWindow
  3. Run on a device with two screens, with the left screen set as the primary screen and the right screen as the touch screen

Touch the right screen and you will see that the strokes drawn by the InkCanvas control appear at incorrect coordinates. You can also monitor events such as StylusMove and use methods such as GetStylusPoint or GetIntermediateTouchPoints to read the touch points; the coordinates you obtain will be incorrect.

My demo code is available at https://github.com/lindexi/lindexi_gd/tree/8c16318d0599c47b2090feaf84e14a2444af829c/BenukalliwayaChayjanehall
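
For completeness, here is a minimal sketch of the repro setup; it assumes the default WPF project template, the class names are the template's defaults, and the actual demo is in the repository linked above.

using System;
using System.Windows;

// App.xaml.cs — step 1: opt in to WM_Pointer handling before WPF
// initializes its stylus/touch stack.
public partial class App : Application
{
    static App()
    {
        AppContext.SetSwitch("Switch.System.Windows.Input.Stylus.EnablePointerSupport", true);
    }
}

// MainWindow.xaml — step 2: host an InkCanvas filling the window, e.g.
//   <Window ...>
//       <InkCanvas />
//   </Window>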

Expected behavior

The correct touch point coordinates are returned.

Actual behavior

The obtained touch point coordinates are incorrect (offset).

Regression?

All

Known Workarounds

Upon investigation by @kkwpsv, this issue was traced to an error in the GetOriginOffsetsLogical method in HwndPointerInputProvider.cs: PointToScreen is used directly to compute the origin offset, which is only correct when there is a single screen. With multiple screens, the top-left corner of the DisplayRect of the screen hosting the window must be subtracted to obtain the expected origin value, which can then be used to convert the window coordinates into coordinates relative to the virtual screen.

/// <summary>
/// This function uses the logical origin of the current hwnd as the offsets for
/// logical pointer coordinates.
///
/// This is needed as WISP's concept of tablet coordinates is not the entire tablet.
/// Instead, WISP transforms tablet X and Y into the tablet context. This does not
/// change the max, min, or resolution, merely translates the origin point to the hwnd
/// origin. Since the inking system in WPF was based on this raw data, we need to
/// recreate the same thing here.
///
/// See Stylus\Biblio.txt - 7
///
/// </summary>
/// <param name="originOffsetX">The X offset in logical coordinates</param>
/// <param name="originOffsetY">The Y offset in logical coordinates</param>
private void GetOriginOffsetsLogical(out int originOffsetX, out int originOffsetY)
{
    Point originScreenCoord = _source.Value.RootVisual.PointToScreen(new Point(0, 0));

    // Use the inverse of our logical tablet to screen matrix to generate tablet coords
    MatrixTransform screenToTablet = new MatrixTransform(_currentTabletDevice.TabletToScreen);
    screenToTablet = (MatrixTransform)screenToTablet.Inverse;

    Point originTabletCoord = originScreenCoord * screenToTablet.Matrix;

    originOffsetX = (int)Math.Round(originTabletCoord.X);
    originOffsetY = (int)Math.Round(originTabletCoord.Y);
}
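
To illustrate the idea, here is a rough sketch of the correction described above, i.e. subtracting the hosting monitor's display rectangle origin. This is only my own sketch, not @kkwpsv's actual patch (see dotnet-campus#9 for that), and the Win32 interop wrapper below is added purely to keep the sketch self-contained.

using System;
using System.Runtime.InteropServices;
using System.Windows;
using System.Windows.Media;

// Win32 interop added only to make this sketch self-contained.
internal static class MonitorInterop
{
    public const uint MONITOR_DEFAULTTONEAREST = 2;

    [StructLayout(LayoutKind.Sequential)]
    public struct RECT { public int left, top, right, bottom; }

    [StructLayout(LayoutKind.Sequential)]
    public struct MONITORINFO
    {
        public int cbSize;
        public RECT rcMonitor;
        public RECT rcWork;
        public uint dwFlags;
    }

    [DllImport("user32.dll")]
    public static extern IntPtr MonitorFromWindow(IntPtr hwnd, uint flags);

    [DllImport("user32.dll")]
    public static extern bool GetMonitorInfo(IntPtr hMonitor, ref MONITORINFO lpmi);
}

// Sketch of the corrected method; _source and _currentTabletDevice are the
// existing fields of HwndPointerInputProvider.
private void GetOriginOffsetsLogical(out int originOffsetX, out int originOffsetY)
{
    Point originScreenCoord = _source.Value.RootVisual.PointToScreen(new Point(0, 0));

    // PointToScreen returns coordinates relative to the virtual screen, but the
    // pointer/tablet context is relative to the monitor hosting the hwnd. Subtract
    // the top-left corner of that monitor's display rectangle so the origin is
    // expressed in the same space as the tablet-to-screen matrix.
    IntPtr hMonitor = MonitorInterop.MonitorFromWindow(_source.Value.Handle, MonitorInterop.MONITOR_DEFAULTTONEAREST);
    var monitorInfo = new MonitorInterop.MONITORINFO { cbSize = Marshal.SizeOf<MonitorInterop.MONITORINFO>() };
    if (MonitorInterop.GetMonitorInfo(hMonitor, ref monitorInfo))
    {
        originScreenCoord.X -= monitorInfo.rcMonitor.left;
        originScreenCoord.Y -= monitorInfo.rcMonitor.top;
    }

    // Use the inverse of our logical tablet to screen matrix to generate tablet coords
    MatrixTransform screenToTablet = new MatrixTransform(_currentTabletDevice.TabletToScreen);
    screenToTablet = (MatrixTransform)screenToTablet.Inverse;

    Point originTabletCoord = originScreenCoord * screenToTablet.Matrix;

    originOffsetX = (int)Math.Round(originTabletCoord.X);
    originOffsetY = (int)Math.Round(originTabletCoord.Y);
}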

Thanks to @kkwpsv for his fix. He has corrected the code and posted it at dotnet-campus#9. I have merged his code and verified it on my setup: with his fix applied, WPF with WM_Pointer enabled correctly obtains touch point coordinates through the GetTouchPoint and GetStylusPoint methods.

You can download and try the fixed build at https://www.nuget.org/packages/dotnetCampus.WPF.Resource/6.0.4-alpha07-test06 . Demo code that exercises the fix using that package version is available at https://github.com/lindexi/lindexi_gd/tree/893292f260c4570ff63e68b9e0a29052a187d0c6/BenukalliwayaChayjanehall

Impact

This bug affects every application with WM_Pointer enabled that runs on a multi-screen setup with touch displays.

Configuration

All

Other information

I will submit @kkwpsv's fix to the WPF repository once more partners and devices have taken part in testing it.
