There are already a lot of useful WPF UI controls that, with a few easy adjustments, can be used within a Surface application. Essentially, you only need to translate mouse events into contact events.
In this blog post I cover how to make two controls Surface-ready, with a quick walkthrough and the resulting code.
First is the WPF Book control from Mitsu Furuta that is available at Codeplex (http://www.codeplex.com/wpfbookcontrol) and is described in a blog post at http://blogs.msdn.com/mitsu/archive/2007/04/18/wpf-book-control.aspx .
The second is the 3D Tools library found at http://3dtools.codeplex.com/ .
WPF Book Control
To port the book control I used the Project WPFMitsuControls and renamed it to SFMitsuControl.
First thing to do is to add the necessary references Microsoft.Surface and Microsoft.Surface.Presentation.
The book is based on a WPF ItemsControl, and the pages are WPF ContentControls. To make them Surface-enabled we need to change them to the corresponding Surface controls. After importing the namespace with xmlns:s="http://schemas.microsoft.com/surface/2008", we make the following changes:
<ItemsControl will be changed into <s:SurfaceItemsControl and
<ContentControl will be changed to <s:SurfaceContentControl
Don’t forget to change the code behind in Book.cs and BookPage.cs to inherit from the corresponding base classes.
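In code-behind, only the base class changes. A sketch (class names follow the WPFMitsuControls source; the bodies stand in for the existing logic):

```csharp
using Microsoft.Surface.Presentation.Controls;

// Book.cs — the book now derives from the Surface ItemsControl
public class Book : SurfaceItemsControl
{
    // ...existing book logic stays unchanged...
}

// BookPage.cs — each page now derives from the Surface ContentControl
public class BookPage : SurfaceContentControl
{
    // ...existing page logic stays unchanged...
}
```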
This gives us access to the ContactDown, ContactChanged and ContactUp events, which we will use to replace the corresponding mouse events of the BookPage. We also need to change the MouseDown handling on the BookPages inside the book control. And that’s it! The code is pretty much the same as with the mouse events; you only need small adjustments, like capturing the contact instead of the mouse.
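As a rough sketch of what the mouse-to-contact translation looks like (the handler names are made up for illustration; the capture helpers are from the Surface SDK's Contacts class):

```csharp
private void BookPage_ContactDown(object sender, ContactEventArgs e)
{
    // capture the contact instead of calling CaptureMouse()
    Contacts.CaptureContact(e.Contact, this);
    // ...same page-turn logic as the old MouseDown handler...
    e.Handled = true;
}

private void BookPage_ContactUp(object sender, ContactEventArgs e)
{
    // release the capture instead of calling ReleaseMouseCapture()
    Contacts.ReleaseContactCapture(e.Contact);
    // ...same logic as the old MouseUp handler...
    e.Handled = true;
}
```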
The resulting code can be downloaded here: Surface Book Control
I took the demo app as is, only created a SurfaceWindow and changed the UserControl to a SurfaceUserControl. I did not bother to make the UserControls completely Surface-enabled.
The second thing I cover is the 3D Tools. I will go over updating the Trackball and the Trackport3D.
We could just do the same with the trackball that we already did with the book control, but there is one catch: the trackball is moved with the left mouse button and zooming is done with the right mouse button. Of course we can’t zoom like this on a Surface. We don’t have a right button, and on top of that it would be totally uncool! Instead, we will use a multitouch pinch gesture to zoom.
Luckily, Microsoft gave us the ManipulationProcessor class, which greatly reduces the effort needed to interpret touch gestures.
Again, we start by adding the references Microsoft.Surface and Microsoft.Surface.Presentation.
The first control we change will be the Trackball. We need to add a local variable for our manipulation processor:
private Affine2DManipulationProcessor manipulationProcessor;
Then, whenever the EventSource property is updated, we initialize the processor and hook up the events:
this.manipulationProcessor = new Affine2DManipulationProcessor(
    Affine2DManipulations.TranslateX |
    Affine2DManipulations.TranslateY |
    Affine2DManipulations.Scale);
manipulationProcessor.Affine2DManipulationDelta += new EventHandler&lt;Affine2DOperationDeltaEventArgs&gt;(manipulationProcessor_Affine2DManipulationDelta);
input.PreviewContactDown += new ContactEventHandler(input_PreviewContactDown);
input.PreviewContactUp += new ContactEventHandler(input_PreviewContactUp);
This creates a new manipulation processor that allows us to translate and scale. We hook up the Affine2DManipulationDelta event, which reports the scale and translation deltas that we use to update our transform object.
We also hook up the ContactDown and ContactUp events. When a contact is recognized in the ContactDown event handler, we capture it and tell our manipulation processor to track the contact. From then on, any change to the contact goes through our processor and results in a delta in scale or translation. The processor does all the hard work here; we only need to look at the deltas to see our change. The delta event handler now looks like this:
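The contact handlers themselves are only a few lines. A sketch of the idea (BeginTrack/EndTrack are the processor's tracking methods as used in the Surface SDK samples; treat the exact calls as assumptions):

```csharp
private void input_PreviewContactDown(object sender, ContactEventArgs e)
{
    // capture the contact so we keep receiving its updates,
    // then hand it to the manipulation processor to track
    Contacts.CaptureContact(e.Contact, (IInputElement)sender);
    manipulationProcessor.BeginTrack(e.Contact);
    e.Handled = true;
}

private void input_PreviewContactUp(object sender, ContactEventArgs e)
{
    // stop tracking the contact and release the capture
    manipulationProcessor.EndTrack(e.Contact);
    Contacts.ReleaseContactCapture(e.Contact);
    e.Handled = true;
}
```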
if (e.ScaleDelta != 0.0)
_previousPosition2D.X = e.ManipulationOrigin.X - e.Delta.X;
_previousPosition2D.Y = e.ManipulationOrigin.Y - e.Delta.Y;
_previousPosition3D = ProjectToTrackball(EventSource.ActualWidth, EventSource.ActualHeight, _previousPosition2D);
It’s pretty simple and just reuses the Track and Zoom functions we already have.
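Filled in, the whole handler might look roughly like this (a sketch only: the Zoom and Track signatures and the exact ordering are assumptions based on the original 3D Tools trackball, where the previous 2D position is projected onto the virtual trackball before tracking):

```csharp
private void manipulationProcessor_Affine2DManipulationDelta(
    object sender, Affine2DOperationDeltaEventArgs e)
{
    if (e.ScaleDelta != 0.0)
    {
        // two-finger pinch: reuse the existing Zoom logic
        Zoom(e.ScaleDelta);
    }

    // reconstruct the previous 2D position from the manipulation
    // origin and the translation delta, project it onto the virtual
    // trackball, then reuse the existing Track logic
    _previousPosition2D.X = e.ManipulationOrigin.X - e.Delta.X;
    _previousPosition2D.Y = e.ManipulationOrigin.Y - e.Delta.Y;
    _previousPosition3D = ProjectToTrackball(
        EventSource.ActualWidth, EventSource.ActualHeight, _previousPosition2D);
    Track(e.ManipulationOrigin);
}
```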
We also need to change our Trackport3D UserControl into a SurfaceUserControl, and change how the Trackport3D uses the trackball. The old version used a Border control to capture the mouse events; since we need a contact-enabled control to capture the events, we use a SurfaceContentControl instead.
And that’s it for the trackball control. The resulting code can be found here: Surface 3D Tools
As you can see, it’s very easy to build upon available WPF controls for your Surface application. The changes are straightforward, and the ManipulationProcessor gives you a great way to easily implement touch gestures.
Great work Simon!
Does the modified library make use of the special Surface gestures, like the rotating and zooming gestures?
The modification of the trackball uses the zooming gesture. This is easily done by leveraging the ManipulationProcessor that comes with the Surface SDK. Other than that I did not use any gestures. The book only uses simple contact events and no manipulation processor.
hi. great article!
Hi, great job! I just started working with these controls. I’m trying to put the book control into a ScatterViewItem for manipulations and inertia, but there’s a problem with the events… it seems the ScatterViewItem takes the contact and the page flip is not working… any ideas or some code :)??
Great work on the controls. I’m using this app to render my 3D controls on Surface. But I see something strange in the application: the zoom works inversely. That is, the shrink gesture actually zooms in and the enlarge gesture zooms out. I haven’t been able to fix that. Any help with that would be great.
Thanks again for the code.
Umm…WPFBook does nothing. Ran it alone and in the surface simulator and it responds to no finger input at all. Is it actually working?
It should work fine, actually. Maybe you have something inside the pages that gets the touch event before it reaches the actual book page. You might check the handlers for ContactUp, -Down, -Leave and -Changed inside BookPage.xaml.cs. If they don’t get fired, the routed event is not reaching the page.
Thanks for the code you have made available.
I was trying to use the TrackballDecorator on the Surface and failing to do so. Any suggestions? Right now I can rotate and scale the entire scene but not individual models in the scene. I would like to rotate individual models on the Surface.
thanks a lot!
The trackball class I used acts on the camera of the viewport. It can’t be used to rotate each individual model.
Here is a description of how it works: http://viewport3d.com/trackball.htm
The same theory could possibly be applied to each individual model, but it would need a different implementation.
thanks for your reply, really appreciate it. I read through your link and now understand how 2D coordinates are mapped to 3D models.
But, I’m not a very skillful programmer. Do you have any suggestions on how to implement model rotation using the trackball class?
This is what I’m thinking:
Don’t change the trackball class at all.
Work on another version of TrackPort3D that rotates the model instead of the camera.
Any more hints? or suggestions? Am I heading in the right direction?
thanks a lot
Thanks for the excellent post. I am currently working on a Microsoft Surface project, and the modified book control is exactly what I was looking for. It worked like a charm. However, when putting it inside a ScatterViewItem (so that users can freely manipulate it), pages never flip and all contact events are handled by the ScatterViewItem and translated as manipulations.
Can you help me with that?