build method

@override
Widget build(
  BuildContext context
)
override

Describes the part of the user interface represented by this widget.

The framework calls this method in a number of different situations: for example, after calling initState, after calling didUpdateWidget, after receiving a call to setState, and after a dependency of this State object changes.

This method can potentially be called in every frame and should not have any side effects beyond building a widget.

The framework replaces the subtree below this widget with the widget returned by this method, either by updating the existing subtree or by removing the subtree and inflating a new subtree, depending on whether the widget returned by this method can update the root of the existing subtree, as determined by calling Widget.canUpdate.
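
That check compares the new widget's runtime type and key against the old widget's. A sketch of the documented Widget.canUpdate behaviour (not the framework source itself):

import 'package:flutter/widgets.dart';

// Sketch of the documented Widget.canUpdate behaviour: the existing subtree
// root is updated in place only when the new widget has the same runtimeType
// and key as the old one; otherwise the old subtree is removed and a new
// subtree is inflated.
bool canUpdateSketch(Widget oldWidget, Widget newWidget) {
  return oldWidget.runtimeType == newWidget.runtimeType &&
      oldWidget.key == newWidget.key;
}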

Typically implementations return a newly created constellation of widgets that are configured with information from this widget's constructor, the given BuildContext, and the internal state of this State object.
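
For instance (a hypothetical Counter widget, not part of this codebase), a build method commonly combines all three sources of information:

import 'package:flutter/material.dart';

// Hypothetical example: build combines the widget's constructor
// configuration (widget.label), this State object's internal state
// (_count), and inherited information from the BuildContext (the theme).
class Counter extends StatefulWidget {
  const Counter({super.key, required this.label});

  final String label;

  @override
  State<Counter> createState() => _CounterState();
}

class _CounterState extends State<Counter> {
  int _count = 0;

  @override
  Widget build(BuildContext context) {
    return TextButton(
      onPressed: () => setState(() => _count++),
      child: Text(
        '${widget.label}: $_count',
        style: Theme.of(context).textTheme.bodyMedium,
      ),
    );
  }
}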

The given BuildContext contains information about the location in the tree at which this widget is being built. For example, the context provides the set of inherited widgets for this location in the tree. The BuildContext argument is always the same as the context property of this State object and will remain the same for the lifetime of this object. The BuildContext argument is provided redundantly here so that this method matches the signature for a WidgetBuilder.
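
WidgetBuilder is declared as Widget Function(BuildContext context), so the tear-off of a State's build method can be passed anywhere a builder callback is expected. A contrived sketch (the Preview widget is hypothetical):

import 'package:flutter/material.dart';

// Contrived, hypothetical example: because State.build matches
// WidgetBuilder (Widget Function(BuildContext context)), the tear-off
// `build` can be passed directly where a builder callback is expected.
class Preview extends StatefulWidget {
  const Preview({super.key});

  @override
  State<Preview> createState() => _PreviewState();
}

class _PreviewState extends State<Preview> {
  @override
  Widget build(BuildContext context) {
    return ElevatedButton(
      // Shows this widget's own content again inside a dialog.
      onPressed: () => showDialog<void>(context: context, builder: build),
      child: const Text('Preview'),
    );
  }
}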

Design discussion

Why is the build method on State, and not StatefulWidget?

Putting a Widget build(BuildContext context) method on State rather than putting a Widget build(BuildContext context, State state) method on StatefulWidget gives developers more flexibility when subclassing StatefulWidget.

For example, AnimatedWidget is a subclass of StatefulWidget that introduces an abstract Widget build(BuildContext context) method for its subclasses to implement. If StatefulWidget already had a build method that took a State argument, AnimatedWidget would be forced to provide its State object to subclasses even though its State object is an internal implementation detail of AnimatedWidget.
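
For example, a minimal AnimatedWidget subclass (this Spinner is hypothetical) only overrides build; the listening State that schedules rebuilds stays hidden inside AnimatedWidget:

import 'dart:math' as math;

import 'package:flutter/material.dart';

// Hypothetical AnimatedWidget subclass: subclasses only implement
// build(BuildContext); the State object that listens to the animation
// and triggers rebuilds remains internal to AnimatedWidget.
class Spinner extends AnimatedWidget {
  const Spinner({super.key, required Animation<double> turns})
      : super(listenable: turns);

  @override
  Widget build(BuildContext context) {
    final Animation<double> turns = listenable as Animation<double>;
    return Transform.rotate(
      angle: turns.value * 2 * math.pi,
      child: const FlutterLogo(),
    );
  }
}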

Conceptually, StatelessWidget could also be implemented as a subclass of StatefulWidget in a similar manner. If the build method were on StatefulWidget rather than State, that would not be possible anymore.
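
A conceptual sketch of that layering (not how the framework actually implements StatelessWidget):

import 'package:flutter/widgets.dart';

// Conceptual only: a StatelessWidget-like base class built on top of
// StatefulWidget, possible precisely because build lives on State. The
// trivial State simply forwards to the widget's own build method.
abstract class StatelessWidgetX extends StatefulWidget {
  const StatelessWidgetX({super.key});

  Widget build(BuildContext context);

  @override
  State<StatelessWidgetX> createState() => _ForwardingState();
}

class _ForwardingState extends State<StatelessWidgetX> {
  @override
  Widget build(BuildContext context) => widget.build(context);
}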

Putting the build function on State rather than StatefulWidget also helps avoid a category of bugs related to closures implicitly capturing this. If you defined a closure in a build function on a StatefulWidget, that closure would implicitly capture this, which is the current widget instance, and would have the (immutable) fields of that instance in scope:

// (this is not valid Flutter code)
class MyButton extends StatefulWidgetX {
  MyButton({super.key, required this.color});

  final Color color;

  @override
  Widget build(BuildContext context, State state) {
    return SpecialWidget(
      handler: () { print('color: $color'); },
    );
  }
}

For example, suppose the parent builds MyButton with color set to blue; the $color in the print handler refers to blue, as expected. Now suppose the parent rebuilds MyButton with green. The closure created by the first build still implicitly refers to the original widget instance, so $color still prints blue even though the widget has been updated to green; should that closure outlive its widget, it would print outdated information.

In contrast, with the build function on the State object, closures created during build implicitly capture the State instance instead of the widget instance:

class MyButton extends StatefulWidget {
  const MyButton({super.key, this.color = Colors.teal});

  final Color color;
  // ...
}

class MyButtonState extends State<MyButton> {
  // ...
  @override
  Widget build(BuildContext context) {
    return SpecialWidget(
      handler: () { print('color: ${widget.color}'); },
    );
  }
}

Now, when the parent rebuilds MyButton with green, the closure created by the first build still refers to the State object, which is preserved across rebuilds, but the framework has updated that State object's widget property to refer to the new MyButton instance, so ${widget.color} prints green, as expected.

See also:

  • StatefulWidget, which contains the discussion on performance considerations.

Implementation

@override
Widget build(BuildContext context) {
  // Keep the value of text field.

  _nnetSizeLayerController.text =
      ref.read(hiddenLayerNeuralProvider.notifier).state.toString();
  _maxNWtsController.text =
      ref.read(maxNWtsProvider.notifier).state.toString();
  _maxitController.text =
      ref.read(maxitNeuralProvider.notifier).state.toString();

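  // Read the current algorithm, error function, and activation function
  // choices so that the chips and choosers below reflect the provider state.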
  String algorithm = ref.read(algorithmNeuralProvider.notifier).state;
  String function = ref.read(errorFctNeuralProvider.notifier).state;
  String activation = ref.read(activationFctNeuralProvider.notifier).state;

  return Column(
    spacing: configRowSpace,
    children: [
      configTopGap,
      Row(
        spacing: configWidgetSpace,
        children: [
          configLeftGap,
          ActivityButton(
            key: const Key('Build Neural Network'),
            tooltip: '''

            Tap to build a Neural Network model using the parameter values
            that you can set here.

            ''',
            pageControllerProvider:
                neuralPageControllerProvider, // Optional navigation

            onPressed: () async {
              // Perform manual validation.
              String? sizeHiddenLayerError =
                  validateInteger(_nnetSizeLayerController.text, min: 1);
              String? maxNWtsError =
                  validateInteger(_maxNWtsController.text, min: 1);
              String? maxitError =
                  validateInteger(_maxitController.text, min: 1);

              // Collect all errors.
              List<String> errors = [
                if (sizeHiddenLayerError != null)
                  'Size Hidden Layer: $sizeHiddenLayerError',
                if (maxNWtsError != null) 'Max NWts: $maxNWtsError',
                if (maxitError != null) 'Maxit: $maxitError',
              ];

              // Check if there are any errors.
              if (errors.isNotEmpty) {
                // Show a warning dialog if validation fails.
                showDialog(
                  context: context,
                  builder: (context) => AlertDialog(
                    title: const Text('Validation Error'),
                    content: Text(
                      'Please ensure all input fields are valid before building the nnet model:\n\n${errors.join('\n')}',
                    ),
                    actions: [
                      TextButton(
                        onPressed: () {
                          Navigator.of(context).pop();
                        },
                        child: const Text('OK'),
                      ),
                    ],
                  ),
                );

                return;
              } else {
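                // All inputs are valid, so persist them to their providers
                // before running the model build scripts.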
                ref.read(hiddenLayerNeuralProvider.notifier).state =
                    int.parse(_nnetSizeLayerController.text);
                ref.read(maxNWtsProvider.notifier).state =
                    int.parse(_maxNWtsController.text);
                ref.read(maxitNeuralProvider.notifier).state =
                    int.parse(_maxitController.text);

                // Run the R scripts.

                String mt = 'model_template';
                String mbn = 'model_build_neuralnet';
                String mbnn = 'model_build_nnet';

                if (context.mounted) {
                  if (algorithm == 'nnet') {
                    await rSource(
                      context,
                      ref,
                      [mt, mbnn],
                    );
                    ref.read(nnetEvaluateProvider.notifier).state = true;
                  } else if (algorithm == 'neuralnet') {
                    await rSource(
                      context,
                      ref,
                      [mt, mbn],
                    );
                    ref.read(neuralNetEvaluateProvider.notifier).state = true;
                  }
                }
              }

              // Update the state to make the neural evaluate tick box
              // automatically selected after the model build.

              ref.read(neuralEvaluateProvider.notifier).state = true;

              await ref.read(neuralPageControllerProvider).animateToPage(
                    // Index of the second page.
                    1,
                    duration: const Duration(milliseconds: 300),
                    curve: Curves.easeInOut,
                  );
            },
            child: const Text('Build Neural Network'),
          ),
          Text('Target: ${getTarget(ref)}'),
          ChoiceChipTip<String>(
            options: neuralAlgorithm.keys.toList(),
            selectedOption: algorithm,
            tooltips: neuralAlgorithm,
            onSelected: (chosen) {
              setState(() {
                if (chosen != null) {
                  algorithm = chosen;
                  ref.read(algorithmNeuralProvider.notifier).state = chosen;
                }
              });
            },
          ),
          LabelledCheckbox(
            key: const Key('NNET Trace'),
            tooltip: '''

            Enable tracing of the optimization for the **single layer neural
            network**. The prediction error is reported to the Console after
            every 10 training iterations.

            ''',
            label: 'Trace',
            provider: traceNeuralProvider,
            enabled: algorithm == 'nnet',
          ),
          LabelledCheckbox(
            tooltip: '''

            Add skip-layer connections from input to output for the **single
            layer neural network**.

            ''',
            label: 'Skip',
            provider: skipNeuralProvider,
            enabled: algorithm == 'nnet',
          ),
          LabelledCheckbox(
            key: const Key('Neural Ignore Categoric'),
            tooltip: '''

            Build the model ignoring the categoric variables. Categoric
            variables are handled by the neural net models by enumerating
            their levels across the other variables. Because this can result
            in many introduced variables, Ignore Categoric is enabled by
            default.

            ''',
            label: 'Ignore Categoric',
            provider: ignoreCategoricNeuralProvider,
          ),
        ],
      ),
      Row(
        spacing: configWidgetSpace,
        children: [
          configBotGap,
          algorithm == 'nnet'
              ? NumberField(
                  label: 'Hidden Layer:',
                  key: const Key('hidden_layers'),
                  controller: _nnetSizeLayerController,

                  tooltip: '''

                  The Hidden Layer parameter is used as the size parameter in
                  the nnet() model which specifies the number of units
                  (neurons) in the single hidden layer of the neural
                  network. It is a simple integer.

                  ''',
                  inputFormatter:
                      FilteringTextInputFormatter.digitsOnly, // Integers only
                  validator: (value) => validateInteger(value, min: 1),
                  stateProvider: hiddenLayerNeuralProvider,
                )
              : VectorNumberField(
                  controller: _neuralHiddenController,
                  stateProvider: hiddenLayersNeuralProvider,
                  label: 'Hidden Layers',
                  tooltip: '''

                  The Hidden Layers parameter is a vector of comma separated
                  integers specifying the number of hidden neurons (vertices)
                  in each of the layers. The number of layers is the number of
                  integers supplied. For example, to specify a network
                  architecture with 10 neurons in the first layer, 5 in the
                  next, and 2 in the penultimate layer, we specify "10,5,2".

                  ''',
                  validator: validateVector,
                  inputFormatter:
                      FilteringTextInputFormatter.allow(RegExp(r'[0-9,\s]')),
                ),
          NumberField(
            label: 'Max Iterations:',
            key: const Key('maxit'),
            controller: _maxitController,
            enabled: algorithm == 'nnet',

            tooltip: '''

            The maximum number of iterations (or epochs) allowed during the
            training of the neural network.

            ''',
            inputFormatter:
                FilteringTextInputFormatter.digitsOnly, // Integers only
            validator: (value) => validateInteger(value, min: 1),
            stateProvider: maxitNeuralProvider,
          ),
          NumberField(
            label: 'Max Weights:',
            key: const Key('max_nwts'),
            controller: _maxNWtsController,
            enabled: algorithm == 'nnet',

            tooltip: '''

            The maximum number of weights allowed in the neural network model.

            ''',
            inputFormatter:
                FilteringTextInputFormatter.digitsOnly, // Integers only
            validator: (value) => validateInteger(value, min: 1),
            stateProvider: maxNWtsProvider,
          ),
          NumberField(
            label: 'Threshold:',
            key: const Key('thresholdNeuralField'),
            controller: _thresholdController,
            tooltip: '''

              The numeric value specifying the threshold for the partial
              derivatives of the error function as the stopping criterion.

              ''',
            enabled: algorithm != 'nnet',
            inputFormatter: FilteringTextInputFormatter.allow(
              RegExp(r'^[0-9]*\.?[0-9]{0,4}$'),
            ),
            validator: (value) => validateDecimal(value),
            stateProvider: thresholdNeuralProvider,
            interval: 0.0005,
            decimalPlaces: 4,
          ),
          NumberField(
            label: 'Max Steps:',
            key: const Key('neuralMaxStepField'),
            controller: _maxStepsController,
            tooltip: '''

              The maximum number of steps for training the neural network.
              Reaching this maximum stops the training process.

              ''',
            enabled: algorithm != 'nnet',
            inputFormatter:
                FilteringTextInputFormatter.digitsOnly, // Integers only
            validator: (value) => validateInteger(value, min: 1000),
            stateProvider: stepMaxNeuralProvider,
          ),
        ],
      ),
      Row(
        spacing: configWidgetSpace,
        children: [
          configBotGap,
          variableChooser(
            'Error Function',
            errorFunction,
            function,
            ref,
            errorFctNeuralProvider,
            tooltip: '''

            The function used to calculate the error. Alternatively, the
            strings 'sse' and 'ce', which stand for the sum of squared errors
            and the cross-entropy, can be used.

            ''',
            enabled: algorithm == 'neuralnet',
            onChanged: (String? value) {
              if (value != null) {
                ref.read(errorFctNeuralProvider.notifier).state = value;
              }
            },
          ),
          variableChooser(
            'Activation Function',
            activationFunction,
            activation,
            ref,
            activationFctNeuralProvider,
            tooltip: '''

            The function used to smooth the result of the cross product of
            the covariates or neurons and the weights. Additionally, the
            strings 'logistic' and 'tanh' may be used for the logistic
            function and the hyperbolic tangent.

            ''',
            enabled: algorithm == 'neuralnet',
            onChanged: (String? value) {
              if (value != null) {
                ref.read(activationFctNeuralProvider.notifier).state = value;
              }
            },
          ),
        ],
      ),
    ],
  );
}