Android has had a ScaleGestureDetector since API level 8 (Android 2.2), around the time the first Android devices with multitouch screens appeared. This was great, and so was the ScaleGestureDetector, of course.
What’s a mystery to me, however, is why this is the only multitouch gesture detector in the API! Why isn’t there something like a RotateGestureDetector to detect rotations between two (or more) fingers… Is it one of the issues in the patent war between Apple and Google? I don’t know, but I want to be able to use it anyway… so I built this small extendable framework, and in this article I’ll explain how you can use it in your Android apps! Now let’s hope I won’t get sued for building the obvious :shock:!
Why I built this small framework
When I googled for ways to read out rotations between two fingers on the screen, I found many solutions from people dealing with the same situation, but none of them very Object Oriented (no interfaces, abstract classes, etc.) and therefore difficult to reuse and extend.
Also, I imagine that someday (when the patent wars are over ;)) Google will add something like a RotationGestureDetector to the API, working the same way the ScaleGestureDetector already does. At that point I would like to only change an import statement and not my whole Activity implementation (no guarantees here ;)). That is why I wanted to build this structure of gesture detectors. You can download (zip) or clone the framework, including an example Android app, right here from GitHub. Import it into your Eclipse workspace and off you go.
How to use Android’s ScaleGestureDetector
Let’s start with a tutorial on how to use the ScaleGestureDetector in your Activity…
Basic structure
In your Activity class you implement the android.view.View.OnTouchListener interface, which makes you implement the onTouch(...) method. This way the Android system is able to notify the class of any android.view.MotionEvent that happens.
public class TouchActivity extends Activity implements OnTouchListener {

    public void onCreate(Bundle savedInstanceState) {
        // Init stuff here
        ...
    }

    public boolean onTouch(View v, MotionEvent event) {
        // Handle touch events here
        ...
    }

    ...
}
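One thing to keep in mind: the Activity only receives these touch events if it is actually registered as the OnTouchListener of a view. A minimal sketch of that wiring, assuming a view with the hypothetical id R.id.touch_view in your layout:

    // In onCreate(...), after setContentView(...):
    // register this Activity as the OnTouchListener of the view you want to manipulate.
    // R.id.touch_view is a made-up id; use whatever view your layout actually defines.
    View touchView = findViewById(R.id.touch_view);
    touchView.setOnTouchListener(this);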
Now you could read out the multitouch events in the onTouch(...) method and handle everything yourself… but trust me, you’ll end up with a clutter of code that will be hard for you or your colleagues to maintain! Also, the android.view.ScaleGestureDetector is already there for you to use anyway if you’re targeting API level 8 or higher, which you probably are by the time you’re reading this article! Let’s delegate the event handling to the ScaleGestureDetector!
public class TouchActivity extends Activity implements OnTouchListener {

    private ScaleGestureDetector mScaleDetector;

    public void onCreate(Bundle savedInstanceState) {
        mScaleDetector = new ScaleGestureDetector(getApplicationContext(), new ScaleListener());
        ...
    }

    public boolean onTouch(View v, MotionEvent event) {
        mScaleDetector.onTouchEvent(event);
        return true;
    }

    ...
}
Of course we first have to create an instance of the ScaleGestureDetector in the onCreate(...) method. Via the member variable mScaleDetector we then delegate each MotionEvent to mScaleDetector.onTouchEvent(event). Great! Now the ScaleGestureDetector can figure out for us whether a scale gesture is being performed. That’s taken off our hands!… Oh wait, how do we then handle a user’s scale action, so we can actually do something with it in our Activity?
Handling events with a Listener
Maybe you’ve noticed the ScaleListener that was passed in when we initialised our ScaleGestureDetector. This listener is our way to know when a scale event has happened and what scaling information is available. Let’s implement this ScaleListener now as a private inner class:
public class TouchActivity extends Activity implements OnTouchListener {

    ...

    private class ScaleListener extends ScaleGestureDetector.SimpleOnScaleGestureListener {
        @Override
        public boolean onScale(ScaleGestureDetector detector) {
            mScaleSpan = detector.getCurrentSpan(); // average distance between fingers
            return true;
        }
    }
}
The ScaleListener class extends ScaleGestureDetector’s inner convenience class SimpleOnScaleGestureListener. This way the ScaleGestureDetector ‘knows’ how to call our implementation when a scale event has happened. In the onScale(...) method, we can read out the data we need from the given ScaleGestureDetector.
Using the scale data
Since the ScaleListener is an inner class, it can store its results in fields of our Activity class for later use. That later use could be your action rendering something… you guessed it… scaled on the screen ;). A good place for this is the onTouch(...) method again.
public class TouchActivity extends Activity implements OnTouchListener {

    private float mScaleSpan = 1.0f;
    private ScaleGestureDetector mScaleDetector;

    ...

    public boolean onTouch(View v, MotionEvent event) {
        mScaleDetector.onTouchEvent(event);

        // The ScaleGestureDetector has handled the event at this point.
        // Perform your magic with mScaleSpan now!
        ...

        return true; // indicate event was handled
    }

    private class ScaleListener extends ScaleGestureDetector.SimpleOnScaleGestureListener {
        @Override
        public boolean onScale(ScaleGestureDetector detector) {
            mScaleSpan = detector.getCurrentSpan(); // average distance between fingers
            return true; // indicate event was handled
        }
    }
}
Preserving state between gestures
Now, detector.getCurrentSpan() always returns the current distance between the two finger touches. This means that if, for example, you want to scale a photo on the screen, you can’t scale it with multiple scale gestures following one another: the user would have to scale the photo in one single gesture! That’s not the usability we want to offer here. We want to preserve the previous scale state so the user can ‘add’ additional scaling to it. This is something you can change in the listener: use ScaleGestureDetector.getScaleFactor() instead, and to preserve the state, use *= instead of =.
public class TouchActivity extends Activity implements OnTouchListener {

    private float mScaleFactor = 1.0f;

    ...

    private class ScaleListener extends ScaleGestureDetector.SimpleOnScaleGestureListener {
        @Override
        public boolean onScale(ScaleGestureDetector detector) {
            mScaleFactor *= detector.getScaleFactor(); // scale change since previous event
            return true; // indicate event was handled
        }
    }
}
Almeros’s multitouch gesture detectors
OK, now you know how to set up your activity in an Object Oriented way with the standard multitouch gesture detector from the Android API. The gesture detectors I introduce can be used in exactly the same way. I’ll start by explaining what the two different gesture detectors detect ;). Also, note that these multitouch gesture detectors keep working when more than two fingers are on the screen!
RotateGestureDetector
Two fingers on the screen define a line between them. In the previous (or first) event the angle of that line was determined relative to a fixed reference. In the current event the RotateGestureDetector can then determine the angle difference between the previous event’s line and the current event’s line.
Of course this angle can only be determined when two fingers are on the screen, so only then will the listener receive a call to the onRotate(...) method. At that point you can read out the current angle difference.
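To give an idea of the geometry involved (a rough sketch of the principle only, not the framework’s actual implementation): the angle of the line through two pointers can be computed with atan2, and the rotation delta is simply the difference between two such angles.

    // Sketch of the underlying idea, not the framework's code.
    // Angle (in degrees) of the line through pointer 0 and pointer 1.
    static float angleBetween(float x0, float y0, float x1, float y1) {
        return (float) Math.toDegrees(Math.atan2(y1 - y0, x1 - x0));
    }

    // The rotation delta between two events would then be something like:
    // float delta = angleBetween(currX0, currY0, currX1, currY1)
    //             - angleBetween(prevX0, prevY0, prevX1, prevY1);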
Just like with the ScaleGestureDetector above, we want to preserve the rotation state so a user can rotate (an image, say) with multiple successive rotation gestures. Use -= instead of =.
public class TouchActivity extends Activity implements OnTouchListener {

    private float mRotationDegrees = 0.f;

    ...

    private class RotateListener extends RotateGestureDetector.SimpleOnRotateGestureListener {
        @Override
        public boolean onRotate(RotateGestureDetector detector) {
            mRotationDegrees -= detector.getRotationDegreesDelta();
            return true;
        }
    }

    ...
}
MoveGestureDetector
This one is actually more of a convenience gesture detector. You could do perfectly well without it (see the sketch below for what that would roughly look like), but to keep your code clear, more general and Object Oriented I think it deserves its spot in the framework.
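For comparison, a minimal sketch of tracking a move by hand in onTouch(...), without any gesture detector. It only follows a single pointer and ignores the multi-finger focus point that MoveGestureDetector gives you; the field names are made up for this example.

    // Hypothetical manual alternative inside your Activity, tracking one pointer only.
    private float mLastX, mLastY;

    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                mLastX = event.getX();
                mLastY = event.getY();
                break;
            case MotionEvent.ACTION_MOVE:
                float dx = event.getX() - mLastX; // movement since the previous event
                float dy = event.getY() - mLastY;
                mLastX = event.getX();
                mLastY = event.getY();
                // apply dx/dy to whatever you are moving
                break;
        }
        return true;
    }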
And again… to preserve the moved state across multiple gestures, we use the delta distance from the previous event to the current one. If you don’t want that, you can use MoveGestureDetector.getFocusX() and MoveGestureDetector.getFocusY() instead.
public class TouchActivity extends Activity implements OnTouchListener {

    private float mFocusX = 0.f;
    private float mFocusY = 0.f;

    ...

    private class MoveListener extends MoveGestureDetector.SimpleOnMoveGestureListener {
        @Override
        public boolean onMove(MoveGestureDetector detector) {
            PointF d = detector.getFocusDelta();
            mFocusX += d.x;
            mFocusY += d.y;

            // mFocusX = detector.getFocusX();
            // mFocusY = detector.getFocusY();
            return true;
        }
    }

    ...
}
All together now… To-geee-ther!
Yes you may sing along ;). You can combine all these gesture detectors to give a user total control over the object on his/her screen.
...
import com.almeros.android.multitouch.gesturedetectors.MoveGestureDetector;
import com.almeros.android.multitouch.gesturedetectors.RotateGestureDetector;

public class TouchActivity extends Activity implements OnTouchListener {

    ...

    private float mScaleFactor = 1.0f;
    private float mRotationDegrees = 0.f;
    private float mFocusX = 0.f;
    private float mFocusY = 0.f;

    private ScaleGestureDetector mScaleDetector;
    private RotateGestureDetector mRotateDetector;
    private MoveGestureDetector mMoveDetector;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        ...

        // Setup Gesture Detectors
        mScaleDetector = new ScaleGestureDetector(getApplicationContext(), new ScaleListener());
        mRotateDetector = new RotateGestureDetector(getApplicationContext(), new RotateListener());
        mMoveDetector = new MoveGestureDetector(getApplicationContext(), new MoveListener());
    }

    public boolean onTouch(View v, MotionEvent event) {
        mScaleDetector.onTouchEvent(event);
        mRotateDetector.onTouchEvent(event);
        mMoveDetector.onTouchEvent(event);

        // Mmmmmhhhagic!!!
        // with: mScaleFactor, mRotationDegrees, mFocusX and mFocusY
        ...

        return true; // indicate event was handled
    }

    private class ScaleListener extends ScaleGestureDetector.SimpleOnScaleGestureListener {
        @Override
        public boolean onScale(ScaleGestureDetector detector) {
            mScaleFactor *= detector.getScaleFactor(); // scale change since previous event
            return true;
        }
    }

    private class RotateListener extends RotateGestureDetector.SimpleOnRotateGestureListener {
        @Override
        public boolean onRotate(RotateGestureDetector detector) {
            mRotationDegrees -= detector.getRotationDegreesDelta();
            return true;
        }
    }

    private class MoveListener extends MoveGestureDetector.SimpleOnMoveGestureListener {
        @Override
        public boolean onMove(MoveGestureDetector detector) {
            PointF d = detector.getFocusDelta();
            mFocusX += d.x;
            mFocusY += d.y;
            return true;
        }
    }
}
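What the ‘magic’ looks like depends entirely on your app. As one possible example (an assumption on my part, not part of the framework): if the object on screen is an ImageView with the made-up id R.id.photo, you could apply the accumulated values at the end of onTouch(...) with the standard View transformation setters, which are available since API level 11.

    // One possible way to use the values, assuming an ImageView with the
    // hypothetical id R.id.photo in your layout (these setters need API level 11+).
    ImageView photo = (ImageView) findViewById(R.id.photo);
    photo.setScaleX(mScaleFactor);
    photo.setScaleY(mScaleFactor);
    photo.setRotation(mRotationDegrees);
    photo.setTranslationX(mFocusX);
    photo.setTranslationY(mFocusY);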
Extending the framework with your own GestureDetectors
In the next article I’m going to write, I’ll explain how this framework is built up and how you can extend it. I’ll also place the code with a simple test application on GitHub so you can all send me pull requests with your extensions!
I hope you like my efforts and I hope you enjoyed reading this tutorial. Cheers to multitouch on Android!
Hi!
Excellent guide and GestureDetector, it made me brave enough to tackle implementing my own GestureDetector (I had some other requirements for mine). I think you should add the neat methods getFocusX and getFocusY that are present in ScaleGestureDetector… Also, here: http://developer.android.com/reference/android/view/MotionEvent.html it is stated that:
“The order in which individual pointers appear within a motion event is undefined. Thus the pointer index of a pointer can change from one event to the next but the pointer id of a pointer is guaranteed to remain constant as long as the pointer remains active. ”
Shouldn’t pointer IDs be used, to avoid 180 degree angle shifts if the pointers switch index?
Cheers,
Tore
Very nice work! It helped a lot, thanks.
I am stuck at a point: I want to do all this with multiple ImageViews. I am able to move on touch, but when two ImageViews are added, only the most recently added ImageView is movable.
I guess the problem is layout_width=”fill_parent”, which causes only the front ImageView to be recognized on touch. And if I use layout_width=”wrap_content”, the ImageView only moves within its own image-sized area and becomes invisible outside it.
Can you help me solve this? I want to move, zoom and rotate multiple images individually on one screen.
Thank you for your work
I needed the RotateGestureDetector and it helped a lot…
Very, very good! Excellent guide!
But I have the following problem:
Your code works well with an ImageView that has the attribute FILL_PARENT, so I can’t use multiple ImageViews. Now that I have many ImageViews, how can I use your framework to move, rotate and zoom each ImageView independently? Can you help me?
Sorry for my bad English!
Nice… but I’d like to use it at work, and for that I need to know: what license is it under? (BSD would be nice.)
Thanks guys for your interest and your comments. I haven’t had time (nor will) to really go into them unfortunately. Some quick comments:
Tore, I created this for my needs. Please feel free to update it to be more complete. Send me a pull request afterwards!
Sawera and ducdx, this is possible, but you should really try to figure out how to do this yourself. This framework performs no magic ;). Find out how to select one of many items on the screen, after which you could, for example, apply rotation to that selected item and not to the others.
Chani, it will probably be BSD (for now, please add a copyright notice to Almeros with a link to this site). I hope that when you add stuff you are using in your company’s app which others could enjoy, you’ll send me a pull request with those additions.
Thanks
Sawera and ducdx, I found a solution for the multiple-view move event. Hope it helps you.
http://blahti.wordpress.com/2011/01/17/moving-views-part-2/
Thanks a lot for the tutorial. It helped me a lot. Thumbs Up.
It is not working when the view is in a nested layout…
Thank you,
It works very well 🙂
Great Work Man…!
As you have explained, the above program is done for an ImageView. How can I do the same for a screen layout? Can you help me?
Can you please provide some hint code for how to handle rotation on two different ImageViews? I have been searching for two days but still no luck. If it’s just a few lines of code, please provide it. I tried to do what you suggested to one of the users here.
I tried requestFocus() in touch mode etc., but still no luck.
Both ImageViews’ Scale and Move gesture detectors work fine; the problem is only with the Rotate gesture detector, I don’t know why.
If someone knows a solution, please provide it.
Hi dandroid, can you please tell me how to handle multiple images? If I have two images, both move within an ImageView-sized area, not the whole layout.
Hi, can you help me figure out why the gesture detectors stopped working after the ImageView was rotated using the setRotation(angle) method? When I set the angle back to 0, they start working again.
Great work buddy…really helpful…!!!
Hey! I downloaded your code and integrated it into my application. I checked your source folder, your layout XML and your Android manifest… but when I run it, it seems it does not detect any gesture… I don’t know why 🙁 ..
It works! Thank you very much for your help!! 🙂
What is the chance of implementing this in a custom View class? Something that extends View and uses onTouchEvent(MotionEvent event)?
I see you use this in an Activity, but I would be dynamically adding views to a layout.
Hi,
When I try to import your library with Gradle in Android Studio, I get a “Failed to find” error.
Please help.
Thanks for this, really helpful. But there’s one challenge this code doesn’t address at all: what happens when you try to scale, rotate and move a view at the same time. Then the calculations have to be perfect, and scaling/rotating also changes the position, for example.
Amazing lib man. I’ve been into Android for a couple of years now and it’s the toughest framework I’ve ever had to learn. It’s great when devs like yourself make our lives easier. When I get some spare time I’m gonna start doing the same. I have published some minor projects to stack overflow under my developer email warwickwestonwright@gmail.com.
Thanks again.
Just wanted to say a big fat thank you! This is awesome. I tackled rotate/pan/scale myself but it wasn’t anywhere near as good as this. Nice work.
Good work. This package works smoothly on my Android 4.4 device.
Thanks.
How can I achieve this: while a rotate gesture is active, scaling should not happen, and while a scale gesture is active, rotation should not happen?
Hey, I tried to import your library with gradle in android studio but I get
Error: Failed to resolve: com.almeros….
or
Error: Configuration with name ‘default’ not found
Could you help me with this problem?
Very good tutorial. Thank you for the explanations!!!!!