Flutter (57): Gesture Recognition


57.1 GestureDetector

This section introduces the GestureDetector and GestureRecognizer used in Flutter to handle gestures, followed by a detailed discussion on gesture competition and conflict.

GestureDetector is a functional widget for gesture recognition; with it we can recognize a variety of gestures. Internally it wraps a Listener and translates raw pointer events into semantic gestures. Next, we’ll walk through recognizing the different gestures.

1. Tap, Double Tap, and Long Press

We use GestureDetector to recognize gestures on a Container. When an event is triggered, the event name is displayed on the Container. To increase the tap area, the Container is set to 200×100. The code is as follows:

class _GestureTestState extends State<GestureTest> {
  String _operation = "No Gesture detected!"; // Store event name

  @override
  Widget build(BuildContext context) {
    return Center(
      child: GestureDetector(
        child: Container(
          alignment: Alignment.center,
          color: Colors.blue,
          width: 200.0,
          height: 100.0,
          child: Text(
            _operation,
            style: TextStyle(color: Colors.white),
          ),
        ),
        onTap: () => updateText("Tap"), // Tap
        onDoubleTap: () => updateText("DoubleTap"), // Double tap
        onLongPress: () => updateText("LongPress"), // Long press
      ),
    );
  }

  void updateText(String text) {
    // Update displayed event name
    setState(() {
      _operation = text;
    });
  }
}

Note: When listening to both onTap and onDoubleTap events, there may be a delay of about 200 milliseconds when a tap event is triggered. This is because after a tap, the user might attempt a double tap, so GestureDetector waits for a moment to determine if it’s a double tap. If only onTap is listened for (without onDoubleTap), there’s no delay.

2. Dragging and Sliding

A complete gesture spans the whole process from the moment the user's finger presses down to the moment it lifts; in between, the finger may or may not move. GestureDetector does not distinguish between dragging and sliding; they are essentially the same event. It takes the top-left corner of the widget being listened to as the origin of the gesture, and recognition begins when the user presses a finger down on that widget. Below is an example of dragging a circular letter "A":

class _Drag extends StatefulWidget {
  @override
  _DragState createState() => _DragState();
}

class _DragState extends State<_Drag> {
  double _top = 0.0; // Offset from the top
  double _left = 0.0; // Offset from the left

  @override
  Widget build(BuildContext context) {
    return Stack(
      children: <Widget>[
        Positioned(
          top: _top,
          left: _left,
          child: GestureDetector(
            child: CircleAvatar(child: Text("A")),
            // Triggered when the finger presses down
            onPanDown: (DragDownDetails e) {
              // Print the position of the finger press (relative to the screen)
              print("User finger down: ${e.globalPosition}");
            },
            // Triggered when the finger slides
            onPanUpdate: (DragUpdateDetails e) {
              // Update offsets and rebuild while the user slides
              setState(() {
                _left += e.delta.dx;
                _top += e.delta.dy;
              });
            },
            onPanEnd: (DragEndDetails e) {
              // Print the velocity on the x and y axes when sliding ends
              print(e.velocity);
            },
          ),
        )
      ],
    );
  }
}

Log:

I/flutter ( 8513): User finger down: Offset(26.3, 101.8)
I/flutter ( 8513): Velocity(235.5, 125.8)

Code Explanation:

  • DragDownDetails.globalPosition: This property represents the user's press position relative to the screen (not the parent component) origin (top-left corner).

  • DragUpdateDetails.delta: This indicates the offset of movement during each Update event triggered while the user slides on the screen.

  • DragEndDetails.velocity: This property represents the sliding speed when the user lifts their finger (including both x and y axes). In this example, the finger lift speed is not processed, but a common effect is to create a deceleration animation based on the lift speed.
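The last bullet mentions a common use of the lift velocity: a deceleration animation after the finger leaves the screen. Below is a minimal sketch of that idea (the class names and the "coast for ~100 ms worth of velocity" factor are illustrative choices, not from the original); it extends the drag example with an AnimationController:

```dart
import 'package:flutter/material.dart';

class _DecelDrag extends StatefulWidget {
  @override
  _DecelDragState createState() => _DecelDragState();
}

class _DecelDragState extends State<_DecelDrag>
    with SingleTickerProviderStateMixin {
  Offset _offset = Offset.zero;        // current position
  Offset _flingStart = Offset.zero;    // position at finger lift
  Offset _flingVelocity = Offset.zero; // velocity at finger lift (px/s)
  late final AnimationController _controller;

  @override
  void initState() {
    super.initState();
    _controller = AnimationController(
      vsync: this,
      duration: const Duration(milliseconds: 300),
    )..addListener(() {
        // Ease out: coast roughly 100 ms worth of the lift velocity.
        final double t = Curves.decelerate.transform(_controller.value);
        setState(() {
          _offset = _flingStart + _flingVelocity * 0.1 * t;
        });
      });
  }

  @override
  void dispose() {
    _controller.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Stack(
      children: <Widget>[
        Positioned(
          top: _offset.dy,
          left: _offset.dx,
          child: GestureDetector(
            child: CircleAvatar(child: Text("A")),
            onPanUpdate: (DragUpdateDetails e) {
              _controller.stop(); // cancel any fling still in progress
              setState(() => _offset += e.delta);
            },
            onPanEnd: (DragEndDetails e) {
              _flingStart = _offset;
              _flingVelocity = e.velocity.pixelsPerSecond;
              _controller.forward(from: 0);
            },
          ),
        ),
      ],
    );
  }
}
```

The curve and coast distance are tuning knobs; a physics-based simulation would give a more natural feel, but this captures the basic pattern of feeding DragEndDetails.velocity into an animation.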

Single-Direction Dragging

In the previous example, dragging can occur in any direction. However, in many scenarios, we may only need to drag in one direction, such as in a vertical list. GestureDetector can recognize gesture events in a specific direction. We can modify the above example to allow dragging only vertically:

class _DragVertical extends StatefulWidget {
  @override
  _DragVerticalState createState() => _DragVerticalState();
}

class _DragVerticalState extends State<_DragVertical> {
  double _top = 0.0;

  @override
  Widget build(BuildContext context) {
    return Stack(
      children: <Widget>[
        Positioned(
          top: _top,
          child: GestureDetector(
            child: CircleAvatar(child: Text("A")),
            // Vertical drag event
            onVerticalDragUpdate: (DragUpdateDetails details) {
              setState(() {
                _top += details.delta.dy;
              });
            },
          ),
        )
      ],
    );
  }
}

This restricts dragging to the vertical direction only. Similarly, listening to onHorizontalDragUpdate restricts sliding to the horizontal direction.
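For completeness, here is a sketch of the horizontal counterpart (class names are illustrative); only the callback name and the tracked axis change:

```dart
import 'package:flutter/material.dart';

class _DragHorizontal extends StatefulWidget {
  @override
  _DragHorizontalState createState() => _DragHorizontalState();
}

class _DragHorizontalState extends State<_DragHorizontal> {
  double _left = 0.0;

  @override
  Widget build(BuildContext context) {
    return Stack(
      children: <Widget>[
        Positioned(
          left: _left,
          child: GestureDetector(
            child: CircleAvatar(child: Text("A")),
            // Horizontal drag event: track only the x-axis delta
            onHorizontalDragUpdate: (DragUpdateDetails details) {
              setState(() {
                _left += details.delta.dx;
              });
            },
          ),
        ),
      ],
    );
  }
}
```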

3. Scaling

GestureDetector can listen for scaling events. The following example demonstrates a simple image scaling effect:

class _Scale extends StatefulWidget {
  const _Scale({Key? key}) : super(key: key);

  @override
  _ScaleState createState() => _ScaleState();
}

class _ScaleState extends State<_Scale> {
  double _width = 200.0; // Adjust image width for scaling effect

  @override
  Widget build(BuildContext context) {
    return Center(
      child: GestureDetector(
        // Set width, height adjusts automatically
        child: Image.asset("images/sea.png", width: _width),
        onScaleUpdate: (ScaleUpdateDetails details) {
          setState(() {
            // Scale factor between 0.8 and 10 times
            _width = 200 * details.scale.clamp(.8, 10.0);
          });
        },
      ),
    );
  }
}
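One caveat with the example above: ScaleUpdateDetails.scale is reported relative to the start of the current pinch (it resets to 1.0 each time a new gesture begins), so multiplying a fixed base of 200 makes the image snap back whenever a second pinch starts. A common remedy (a sketch, not from the original; class names and clamp bounds are illustrative) is to snapshot the width in onScaleStart:

```dart
import 'package:flutter/material.dart';

class _ScaleFix extends StatefulWidget {
  @override
  _ScaleFixState createState() => _ScaleFixState();
}

class _ScaleFixState extends State<_ScaleFix> {
  double _width = 200.0;     // current image width
  double _baseWidth = 200.0; // width when the current pinch started

  @override
  Widget build(BuildContext context) {
    return Center(
      child: GestureDetector(
        child: Image.asset("images/sea.png", width: _width),
        onScaleStart: (ScaleStartDetails details) {
          // details.scale in later updates is relative to this moment,
          // so snapshot the width here and scale from it.
          _baseWidth = _width;
        },
        onScaleUpdate: (ScaleUpdateDetails details) {
          setState(() {
            // Scale from the snapshot, keeping the width within bounds
            _width = (_baseWidth * details.scale).clamp(100.0, 800.0);
          });
        },
      ),
    );
  }
}
```

With this change, successive pinches compound naturally instead of resetting to the original size.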

The effect after running is shown in Figure:


Now you can pinch the image with two fingers to zoom in and out. This example is simple; in practice we often need more, such as double-tapping to zoom in or out by a fixed factor, or running a deceleration animation when both fingers leave the screen. Readers can try implementing these after studying the later "Animation" chapter.
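Of the extras mentioned, double-tap zoom needs no animation knowledge at all; a minimal sketch (class names and the 2x factor are illustrative choices) simply toggles between two widths:

```dart
import 'package:flutter/material.dart';

class _DoubleTapZoom extends StatefulWidget {
  @override
  _DoubleTapZoomState createState() => _DoubleTapZoomState();
}

class _DoubleTapZoomState extends State<_DoubleTapZoom> {
  double _width = 200.0;

  @override
  Widget build(BuildContext context) {
    return Center(
      child: GestureDetector(
        child: Image.asset("images/sea.png", width: _width),
        // Toggle between 1x and 2x on double tap; an animation could
        // smooth this jump once the "Animation" chapter is covered.
        onDoubleTap: () {
          setState(() {
            _width = _width == 200.0 ? 400.0 : 200.0;
          });
        },
      ),
    );
  }
}
```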

57.2 GestureRecognizer

Inside GestureDetector, one or more GestureRecognizers are used to recognize the various gestures; a GestureRecognizer's job is to convert raw pointer events (delivered via a Listener) into semantic gestures. GestureDetector can be wrapped directly around a child widget, but GestureRecognizers can also be used on their own. GestureRecognizer is an abstract class; each kind of gesture recognizer is a subclass of it, and Flutter ships with a rich set of recognizers that we can use directly.

Example

Suppose we want to attach tap handlers to different parts of a rich text (RichText). Since TextSpan is not a widget, we cannot wrap it in a GestureDetector; however, TextSpan has a recognizer property that accepts a GestureRecognizer.

Let’s say we want to change the color of the text when clicked:

import 'package:flutter/gestures.dart';

class _GestureRecognizer extends StatefulWidget {
  const _GestureRecognizer({Key? key}) : super(key: key);

  @override
  _GestureRecognizerState createState() => _GestureRecognizerState();
}

class _GestureRecognizerState extends State<_GestureRecognizer> {
  final TapGestureRecognizer _tapGestureRecognizer = TapGestureRecognizer();
  bool _toggle = false; // Color toggle switch

  @override
  void dispose() {
    // Always call dispose() to release resources when using GestureRecognizer
    _tapGestureRecognizer.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Center(
      child: Text.rich(
        TextSpan(
          children: [
            TextSpan(text: "Hello World"),
            TextSpan(
              text: "Click me to change color",
              style: TextStyle(
                fontSize: 30.0,
                color: _toggle ? Colors.blue : Colors.red,
              ),
              recognizer: _tapGestureRecognizer
                ..onTap = () {
                  setState(() {
                    _toggle = !_toggle;
                  });
                },
            ),
            TextSpan(text: "Hello World"),
          ],
        ),
      ),
    );
  }
}

Note: Always call the dispose() method to release resources when using GestureRecognizer (mainly to cancel internal timers).