Android multitouch gesture detectors

Android has had a ScaleGestureDetector since API level 8 (Android 2.2), which is roughly when the first Android devices with multitouch screens appeared. This was great, and so was the ScaleGestureDetector, of course.

What’s a mystery to me however, is why this is the only multitouch gesture detector in the API! Why isn’t there something like a RotateGestureDetector to detect rotations between two (or more) fingers… Is it one of the issues in the patent war between Apple and Google? I don’t know, but I want to be able to use it anyway… so I built this small extendable framework, and in this article I’ll explain how you can use it in your Android apps! Now let’s hope I won’t get sued for building the obvious :shock: !

Why I built this small framework

When I googled for solutions to read out rotations between two fingers on the screen, I found many solutions from people dealing with the same situation, but none of them very Object Oriented (no interfaces, abstract classes, etc.) and therefore harder to reuse and extend.

Also, I imagine that someday (when the patent wars are over ;) ) Google will add something like a RotationGestureDetector to the API, working the same way the ScaleGestureDetector already does. At that point I would like to only change an import statement and not my whole Activity implementation (no guarantees here ;) ). That’s why I wanted to give these gesture detectors this structure. You can download (zip) or clone the framework, including an example Android app, right here from Github. Import it in your Eclipse workspace and off you go.

How to use Android’s ScaleGestureDetector

Let’s start with a tutorial on how to use the ScaleGestureDetector in your Activity…

Basic structure

In your Activity class you implement the android.view.View.OnTouchListener interface, which makes you implement the onTouch(...) method. This way the Android system can inform your class of any android.view.MotionEvent that occurs. (Don’t forget to register the listener on a view with setOnTouchListener(this), or onTouch(...) will never be called.)

public class TouchActivity extends Activity implements OnTouchListener {

	@Override
	public void onCreate(Bundle savedInstanceState) {
		super.onCreate(savedInstanceState);
		// Init stuff here
		...
	}

	public boolean onTouch(View v, MotionEvent event) {
		// Handle touch events here
		...
	}

	...
}

Now you could read out the multitouch events in the onTouch(...) method and handle everything yourself… but trust me, you’ll end up with a clutter of code that will be hard for you or your colleagues to maintain! Besides, the android.view.ScaleGestureDetector is already there for you to use if you’re targeting API level 8 or higher, which you probably are by the time you are reading this article! Let’s delegate the event handling to the ScaleGestureDetector!

public class TouchActivity extends Activity implements OnTouchListener {

	private ScaleGestureDetector mScaleDetector;

	@Override
	public void onCreate(Bundle savedInstanceState) {
		super.onCreate(savedInstanceState);
		mScaleDetector = new ScaleGestureDetector(getApplicationContext(), new ScaleListener());
		...
	}

	public boolean onTouch(View v, MotionEvent event) {
		mScaleDetector.onTouchEvent(event);
		return true; // indicate event was handled
	}

	...
}

Of course we first have to create an instance of the ScaleGestureDetector in the onCreate(...) method. Via the member variable mScaleDetector we then delegate every MotionEvent by calling mScaleDetector.onTouchEvent(event). Great! Now the ScaleGestureDetector can figure out for us whether scaling is being performed. That takes the work off our hands!… Oh wait, how do we then handle a user’s scale action, so we can actually do something with it in our Activity?

Handling events with a Listener

Maybe you’ve noticed the ScaleListener being passed in when we initialised our ScaleGestureDetector. This listener is our way to know when a scale event has happened and what scaling information is available. Let’s implement this ScaleListener now as a private inner class:

public class TouchActivity extends Activity implements OnTouchListener {
	...

	private class ScaleListener extends ScaleGestureDetector.SimpleOnScaleGestureListener {
		@Override
		public boolean onScale(ScaleGestureDetector detector) {
			mScaleSpan = detector.getCurrentSpan(); // average distance between fingers
			return true;
		}
	}

}

The ScaleListener class extends ScaleGestureDetector‘s inner class SimpleOnScaleGestureListener, a convenience implementation of the OnScaleGestureListener interface. This way the ScaleGestureDetector ‘knows’ how to call our implementation when a scale event happens. In the onScale(...) method, we can read out the data we need from the given ScaleGestureDetector.

Using the scale data

Since the ScaleListener is an inner class, it can access the member variables of our Activity class to store its results for later use. This could be your action rendering something… you guessed it… scaled on the screen ;) . A good place for this could be the onTouch(...) method again.

public class TouchActivity extends Activity implements OnTouchListener {

	private float mScaleSpan = 1.0f;
	private ScaleGestureDetector mScaleDetector;

	...

	public boolean onTouch(View v, MotionEvent event) {
		mScaleDetector.onTouchEvent(event);

		// ScaleDetector handled event at this point.
		// Perform your magic with mScaleSpan now!
		...

		return true; // indicate event was handled
	}

	private class ScaleListener extends ScaleGestureDetector.SimpleOnScaleGestureListener {
		@Override
		public boolean onScale(ScaleGestureDetector detector) {
			mScaleSpan = detector.getCurrentSpan(); // average distance between fingers
			return true; // indicate event was handled
		}
	}

}

Preserving state between gestures

Now, detector.getCurrentSpan() always returns the current distance between the finger touches. This means that if, for example, you want to scale a photo on the screen, you can’t scale it with multiple scale gestures following one another; the user must scale the photo in one gesture! That’s not the usability we want to offer in this case. We want to preserve the previous scale state so the user can ‘add’ additional scaling to it. This is something you can change in the listener: use ScaleGestureDetector.getScaleFactor(), which returns the scale change since the previous event, and preserve the state by using *= instead of =.

public class TouchActivity extends Activity implements OnTouchListener {

	private float mScaleFactor = 1.0f;
	...

	private class ScaleListener extends ScaleGestureDetector.SimpleOnScaleGestureListener {
		@Override
		public boolean onScale(ScaleGestureDetector detector) {
			mScaleFactor *= detector.getScaleFactor(); // scale change since previous event
			return true; // indicate event was handled
		}
	}

}
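To see why *= (and not =) is the right operator here, remember that each onScale(...) call reports only the change since the previous event; multiplying those changes together yields the total scale, and the total survives between gestures. A tiny standalone check (the helper below is hypothetical, purely to demonstrate the arithmetic):

```java
// Demonstrates the *= accumulation from the listener above: each per-event
// factor is the scale change since the previous event, so the running total
// is their product. (Hypothetical helper, only for illustration.)
public class ScaleState {
    static float accumulate(float start, float[] perEventFactors) {
        float scale = start;
        for (float f : perEventFactors) {
            scale *= f; // same idea as: mScaleFactor *= detector.getScaleFactor()
        }
        return scale;
    }
}
```

Scaling up by 2x and then back down by 0.5x within (or across) gestures lands you back at the original size, exactly as a user would expect.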

Almeros’ multitouch gesture detectors

OK, now you know how to set up your activity in an Object Oriented way with the standard multitouch gesture detector in the Android API. The gesture detectors I introduce here can be used in the same way. I’ll start by explaining what the two gesture detectors detect ;) . Also, note that these multitouch gesture detectors keep working with more than two fingers on the screen!

RotateGestureDetector

Two fingers on the screen define a line between them. In the previous (first) onRotate event the angle of that line was determined relative to a fixed reference axis. In the current event the RotateGestureDetector can then determine the angle difference between the previous and the current event’s lines.

This angle can of course only be determined when two fingers are on the screen, so only then will the listener receive a call to the onRotate(...) method. At that point you can read out the current angle difference.
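To build some intuition for what the detector computes: the angle of the line between two pointers can be derived from their coordinates with Math.atan2. The sketch below only illustrates the idea; it is not the framework’s actual implementation:

```java
// Illustrative sketch (not the framework's actual code): deriving the angle
// of the line between two pointers, and the rotation delta between events.
public class RotationMath {

    // Angle in degrees of the line from pointer (x0, y0) to pointer (x1, y1).
    // Note: on Android the Y axis points down, which flips the sign of angles
    // compared to regular math coordinates.
    static float lineAngle(float x0, float y0, float x1, float y1) {
        return (float) Math.toDegrees(Math.atan2(y1 - y0, x1 - x0));
    }

    // Rotation since the previous event: difference of the two line angles.
    static float rotationDelta(float previousAngle, float currentAngle) {
        return currentAngle - previousAngle;
    }
}
```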

Just as with the ScaleGestureDetector above, we want to preserve the rotation state so a user can rotate (an image) with multiple successive rotation gestures. Use -= instead of =.

public class TouchActivity extends Activity implements OnTouchListener {

	private float mRotationDegrees = 0.f;
	...

	private class RotateListener extends RotateGestureDetector.SimpleOnRotateGestureListener {
		@Override
		public boolean onRotate(RotateGestureDetector detector) {
			mRotationDegrees -= detector.getRotationDegreesDelta();
			return true;
		}
	}	

	...
}

MoveGestureDetector

This one is actually more of a convenience gesture detector: it tracks the movement of the finger(s) across the screen. You could do very well without it, but to keep your code clear, more general and Object Oriented, I think it deserves its spot in the framework.

And again… to preserve the moved state between multiple gestures we use the delta distance from the previous event to the current one. If you don’t want that, you can use MoveGestureDetector.getFocusX() and MoveGestureDetector.getFocusY() instead.

public class TouchActivity extends Activity implements OnTouchListener {

	private float mFocusX = 0.f;
	private float mFocusY = 0.f;
	...

	private class MoveListener extends MoveGestureDetector.SimpleOnMoveGestureListener {
		@Override
		public boolean onMove(MoveGestureDetector detector) {
			PointF d = detector.getFocusDelta();
			mFocusX += d.x;
			mFocusY += d.y;		

			// mFocusX = detector.getFocusX();
			// mFocusY = detector.getFocusY();
			return true;
		}
	}	

	...
}

All together now… To-geee-ther!

Yes you may sing along ;) . You can combine all these gesture detectors to give a user total control over the object on his/her screen.

...
import com.almeros.android.multitouch.gesturedetectors.MoveGestureDetector;
import com.almeros.android.multitouch.gesturedetectors.RotateGestureDetector;

public class TouchActivity extends Activity implements OnTouchListener {
	...

    private float mScaleFactor = 1.0f;
    private float mRotationDegrees = 0.f;
    private float mFocusX = 0.f;
    private float mFocusY = 0.f;  

    private ScaleGestureDetector mScaleDetector;
    private RotateGestureDetector mRotateDetector;
    private MoveGestureDetector mMoveDetector;

	@Override
	public void onCreate(Bundle savedInstanceState) {
		super.onCreate(savedInstanceState);
		...

		// Setup Gesture Detectors
		mScaleDetector = new ScaleGestureDetector(getApplicationContext(), new ScaleListener());
		mRotateDetector = new RotateGestureDetector(getApplicationContext(), new RotateListener());
		mMoveDetector = new MoveGestureDetector(getApplicationContext(), new MoveListener());
	}

	public boolean onTouch(View v, MotionEvent event) {
		mScaleDetector.onTouchEvent(event);
		mRotateDetector.onTouchEvent(event);
		mMoveDetector.onTouchEvent(event);

		// Mmmmmhhhagic!!!
		//  with: mScaleFactor, mRotationDegrees, mFocusX and mFocusY
		...

		return true; // indicate event was handled
	}

	private class ScaleListener extends ScaleGestureDetector.SimpleOnScaleGestureListener {
		@Override
		public boolean onScale(ScaleGestureDetector detector) {
			mScaleFactor *= detector.getScaleFactor(); // scale change since previous event
			return true;
		}
	}

	private class RotateListener extends RotateGestureDetector.SimpleOnRotateGestureListener {
		@Override
		public boolean onRotate(RotateGestureDetector detector) {
			mRotationDegrees -= detector.getRotationDegreesDelta();
			return true;
		}
	}	

	private class MoveListener extends MoveGestureDetector.SimpleOnMoveGestureListener {
		@Override
		public boolean onMove(MoveGestureDetector detector) {
			PointF d = detector.getFocusDelta();
			mFocusX += d.x;
			mFocusY += d.y;		

			return true;
		}
	}		

}
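What the ‘magic’ in onTouch(...) usually boils down to is composing the four accumulated values into one transform. On Android you would typically feed them into an android.graphics.Matrix (postScale, postRotate, postTranslate) in your drawing code; the plain-Java sketch below shows the equivalent math applied to a single point, so it stands on its own. All names here are hypothetical:

```java
// Plain-Java sketch of composing the accumulated gesture values into one
// transform, applied to a single point: scale first, then rotate (degrees),
// then translate to the focus position.
public class GestureTransform {
    static float[] apply(float x, float y,
                         float scaleFactor, float rotationDegrees,
                         float focusX, float focusY) {
        double r = Math.toRadians(rotationDegrees);
        float sx = x * scaleFactor;                                // scale
        float sy = y * scaleFactor;
        float rx = (float) (sx * Math.cos(r) - sy * Math.sin(r));  // rotate
        float ry = (float) (sx * Math.sin(r) + sy * Math.cos(r));
        return new float[] { rx + focusX, ry + focusY };           // translate
    }
}
```

The order matters: scaling and rotating around the origin first, then moving to the focus point, matches how you would chain the post* calls on a Matrix.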

Extending the framework with your own GestureDetectors

In the next article, I’ll explain how this framework is built up and how you can extend it. I’ll also put the code with a simple test application on Github so you can all send me pull requests with your extensions!

I hope you like my efforts and I hope you enjoyed reading this tutorial. Cheers to multitouch on Android!
