Models 8-10: Automated Guided Vehicle

Safely on the road

Grade: 11-13
Time required: 2-3 double lessons per learning unit (extendable to 13 double lessons)
Difficulty level: model medium; programming easy to difficult
Model type: table-top model of a driverless transport vehicle (AGV)

MODEL DESCRIPTION/TASK

The students plan and build driverless transport vehicles - automated guided vehicles (AGVs), abbreviated FTF in the model names - and equip them step by step with additional sensors and increasingly intelligent control. The sequence starts with a basic vehicle with encoder motors that drives defined distances and turns and manages its first line following with the help of a lane sensor. It continues with an AGV that uses an ultrasonic sensor for collision avoidance and a USB camera to react to coloured surfaces, and ends with closed-loop-controlled and finally AI-supported line following. As the models evolve, so do the demands on design, wiring and programming.

The students learn to create state transition diagrams for the driving states and to read and use sensor data in a targeted way. From the properties and measured values of the sensors, they derive suitable driving and steering parameters, define variables for distance and angle control, and determine reaction times for reliable course corrections.
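
To make the step from a state transition diagram to a program concrete, the following is a minimal Python sketch of a driving-state machine. All hardware functions (reading the lane sensor, detecting an obstacle, setting the motors) are placeholders, not the actual TXT 4.0 API, and the speed value is an assumed example.

# Minimal sketch of a driving-state machine for the AGV, assuming hypothetical
# helper functions for sensor access and motor control (NOT the real TXT 4.0
# API; they would have to be replaced by the actual controller calls).

from enum import Enum, auto
import time

class State(Enum):
    FOLLOW_LINE = auto()
    AVOID_OBSTACLE = auto()
    STOPPED = auto()

BASE_SPEED = 300  # assumed speed value; the unit depends on the motor API

# --- placeholder hardware access (to be replaced) --------------------------
def read_lane_sensor():       # returns (left_on_line, right_on_line)
    return True, True

def obstacle_ahead():         # e.g. a pressed mini button or a short distance
    return False

def set_motors(left, right):  # set the speeds of the two encoder motors
    print(f"motors: {left} / {right}")
# ----------------------------------------------------------------------------

def next_state(left_on, right_on, obstacle):
    """State transitions as they would appear in the state transition diagram."""
    if obstacle:
        return State.AVOID_OBSTACLE
    if not left_on and not right_on:
        return State.STOPPED            # lane lost completely -> stop
    return State.FOLLOW_LINE

def act(state, left_on, right_on):
    """Rule-based reaction for each driving state."""
    if state is State.FOLLOW_LINE:
        if left_on and right_on:
            set_motors(BASE_SPEED, BASE_SPEED)        # drive straight
        elif left_on:
            set_motors(BASE_SPEED // 2, BASE_SPEED)   # line is to the left -> turn left
        else:
            set_motors(BASE_SPEED, BASE_SPEED // 2)   # line is to the right -> turn right
    elif state is State.AVOID_OBSTACLE:
        set_motors(-BASE_SPEED // 2, BASE_SPEED // 2) # simple evasive turn on the spot
    else:
        set_motors(0, 0)                              # stopped

if __name__ == "__main__":
    for _ in range(5):                  # a few control cycles as a demonstration
        left_on, right_on = read_lane_sensor()
        state = next_state(left_on, right_on, obstacle_ahead())
        act(state, left_on, right_on)
        time.sleep(0.05)                # reaction time / control interval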

From the states and parameters determined in this way, the students first develop rule-based control programs with variables, subroutines and state logic. They then configure and train a neural network for a regression problem that predicts suitable motor speeds from the sensor inputs and, as a differentiation option, is extended to include the distance sensor as an additional input. The students test their solutions in driving trials on the course - straight-line stability, lane keeping, evasive manoeuvres and reactions to coloured surfaces - and improve them through systematic troubleshooting and speed optimisation.
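
How such a regression problem can look in code is illustrated by the following sketch, which uses scikit-learn's MLPRegressor in place of the training environment actually used in class; the sensor inputs, target motor speeds and network size are invented example values.

# Illustrative sketch (not the fischertechnik training tool): a small neural
# network learns motor speeds from sensor inputs as a regression problem.
# The training data below are invented example values for demonstration only.

import numpy as np
from sklearn.neural_network import MLPRegressor

# Inputs: [left IR on line (0/1), right IR on line (0/1), normalised distance 0..1]
X = np.array([
    [1, 1, 1.0],   # line under both sensors, path clear -> drive straight
    [1, 0, 1.0],   # line only under the left sensor     -> turn left
    [0, 1, 1.0],   # line only under the right sensor    -> turn right
    [1, 1, 0.1],   # obstacle close                      -> slow down
])

# Targets: [left motor speed, right motor speed], normalised to 0..1
y = np.array([
    [1.0, 1.0],
    [0.4, 1.0],
    [1.0, 0.4],
    [0.2, 0.2],
])

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=1)
model.fit(X, y)

# Predict motor speeds for a new sensor reading (line to the left, obstacle far away)
print(model.predict([[1, 0, 0.9]]))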

EVERYDAY RELEVANCE

The sensor-based control of a vehicle is familiar to students from technology or computer science lessons and from everyday life. Familiar applications include cruise control and lane-keeping assistants in cars, e-scooters with sensors, and vacuum robots that follow lines and avoid obstacles.

Embedding the topic in a realistic mobility context creates a high level of motivation because the students immediately recognise parallels to urban traffic and warehouse logistics. For careers orientation, the topic links well to automotive engineering, electrical engineering and robotics/automation, where sensor-driven control and the automated actuation of actuators are core competences.
Students encounter the combination of sensors and AI-supported control not only in industry and transport, but also in their home environment, e.g. in smart-home applications, intelligent heating or automatic lighting - wherever measured values trigger decisions and movements have to be executed reliably.

Key Questions

  • Which sensors and control types are suitable for an automated guided vehicle (AGV) on the course? (Communication)
  • How do you combine different sensor values (IR sensor, ultrasonic sensor, USB camera) for safe navigation? (Collaboration)
  • Which compromises between speed, precision and safety are sensible? (Critical thinking)
  • How can the AGV’s behaviour be developed from a rule-based to an AI-supported control system? (Creativity)

Subject coverage

Informatics
Advanced programming, conditionals and loops, functions, state machines, event control, camera integration, P and PD controllers, neural networks, training of a neural network
Mathematics
Calculating with terms, scaling, proportionality and linear functions, unit conversion, normalisation, regression
Technology
Stable construction, construction technology
Physics
Motion (distance, time, speed), signal processing, ultrasonic time-of-flight measurement and sound propagation, inertia and braking distance, colour detection, control engineering
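
As a worked example for the physics content, the following short sketch shows how the distance to an obstacle follows from the measured ultrasonic echo time; the echo time used is an assumed example value.

# Worked example (assumed values): the ultrasonic sensor measures the time of
# flight of a sound pulse; the distance follows from the speed of sound, halved
# because the pulse travels to the obstacle and back.

SPEED_OF_SOUND = 343.0        # m/s in air at about 20 °C

def distance_from_runtime(runtime_s):
    """Distance to the obstacle in metres for a measured echo runtime."""
    return SPEED_OF_SOUND * runtime_s / 2

print(distance_from_runtime(0.002))   # 2 ms echo -> about 0.34 m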

Lesson Plan

Introduction phase
Planning phase
Design phase for driving training (FTF 1)
Programming phase for driving training (FTF 1)
Experimentation and test phase for driving training (FTF 1)
Final/follow-up phase for driving training (FTF 1)
Design phase for the digital line follower (FTF 2)
Programming phase for the digital line follower (FTF 2)
Experimentation and test phase for the digital line follower (FTF 2)
Final/follow-up phase for the digital line follower (FTF 2)
Design phase for the analogue line follower (FTF 3)
Programming phase for the analogue line follower (FTF 3)
Experimentation and test phase for the analogue line follower (FTF 3)
Final/follow-up phase for the analogue line follower (FTF 3)
Design phase for the AI line follower (FTF 4)
Programming phase for the AI line follower (FTF 4)
Experimentation and test phase for the AI line follower (FTF 4)
Final/follow-up phase for the AI line follower (FTF 4)

 

Notes and information

Methodological and didactic notes

Differentiation options

Depending on the duration of the lesson series and the ability of the students, the following differentiation options are possible:

  • increasing the complexity of the task by combining several sensors,
  • extending the AGV’s capabilities through more complex strategies for avoiding obstacles or for detecting and reacting to coloured surfaces in the environment,
  • making the AGV’s handling characteristics smoother by using P and PD controllers (see the sketch after this list),
  • letting a neural network take over the motor control when following lines or braking.
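
The sketch below illustrates the P/PD idea for lane keeping referred to in the list above. It assumes that the camera or lane sensor provides a lateral deviation value; the gains, base speed and control interval are illustrative values, not tuned settings for the real model, and there is no real motor or camera I/O in this sketch.

# Minimal PD controller sketch for lane keeping (illustrative values and
# placeholder I/O, not the real camera/motor API of the TXT 4.0 controller).

KP = 2.0          # proportional gain: reacts to the current deviation
KD = 0.5          # derivative gain: damps the reaction using the change of deviation
BASE_SPEED = 300  # assumed base speed of both motors
DT = 0.05         # control interval in seconds

def pd_step(error, last_error):
    """Return the steering correction for the current lateral deviation."""
    p_part = KP * error
    d_part = KD * (error - last_error) / DT
    return p_part + d_part

def motor_speeds(error, last_error):
    """Distribute the steering correction to the two encoder motors."""
    correction = pd_step(error, last_error)
    return BASE_SPEED - correction, BASE_SPEED + correction

# Example: the line is 20 pixels left of the image centre and drifting further left.
left, right = motor_speeds(error=-20, last_error=-15)
print(left, right)   # the left motor speeds up, so the vehicle steers back towards the line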

Motivational aspects

Working with driverless transport vehicles (AGVs) ties in directly with the everyday experiences of the students. Already during driving training they experience how just a few lines of program code set a vehicle reliably in motion - direct feedback that creates a high level of motivation and arouses curiosity. With the digital line follower, the everyday relevance is reinforced by parallels to robotic vacuum cleaners, robotic lawn mowers or driver assistance systems that recognise lines and automatically avoid obstacles. The idea that the AGV can now “see” and react like a robot promotes identification with the task and increases interest in practical testing.

The analogue line follower with camera and closed-loop control opens up insights into modern vehicle technologies such as lane-keeping assistance. Here it becomes clear how parameter changes or the addition of a derivative component influence the handling characteristics - a motivating field for experimentation.

Finally, with the AI line follower, artificial intelligence takes centre stage: the students experience how training a neural network improves the driving behaviour. This builds a bridge to current discussions about AI in everyday life and gives the students the opportunity to understand and apply future technologies themselves.





 


Supplementary materials

  • If available, a video on the topic can be used in the introduction phase. 
  • Drawing media (paper, whiteboard or projection surface).

Functions of the model and their technical solutions

Functions of the sensors/actuators | Technical solution
Rotation and speed adjustment of the encoder motors | Controlling the vehicle
Mini buttons | Obstacle detection (FTF 1)
Measurement of brightness differences | Detecting the lane (FTF 1-2, FTF 4)
Measurement of distances | Collision avoidance (FTF 2, FTF 4)
Colour recognition by USB camera (FTF 2) | Reaction to coloured surfaces
Lane recognition by USB camera (FTF 3) | Lane keeping using a P controller and a PD controller
Lane recognition by lane sensor (FTF 4) | Building a neural network, entering training data, implementing a regression problem with AI
Further differentiation options | Optimisation of the speed control, optimisation of the obstacle-avoidance strategies, optimisation of the strategy for regaining the lane after complete lane loss, design of an own vehicle with self-selected sensor equipment
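
As an illustration of the colour recognition listed above (FTF 2), here is a minimal sketch using OpenCV; the camera index, HSV thresholds and the 20 % decision threshold are assumed example values that would have to be adapted to the real camera and lighting.

# Sketch of colour recognition with the USB camera (FTF 2), assuming OpenCV is
# installed; the thresholds are example values for a red surface.

import cv2
import numpy as np

cap = cv2.VideoCapture(0)            # first USB camera (assumed index)
ok, frame = cap.read()
cap.release()

if ok:
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # example HSV range for red-ish colours (assumed values)
    mask = cv2.inRange(hsv, np.array([0, 120, 70]), np.array([10, 255, 255]))
    red_fraction = cv2.countNonZero(mask) / mask.size
    if red_fraction > 0.2:           # more than 20 % of the image is red
        print("red surface detected -> e.g. stop the AGV")
    else:
        print("no red surface detected")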

Material List

Sensors | Function
1 On/Off button on the TXT 4.0 controller | Switching on the AGV
2 mini buttons | Obstacle detection (FTF 1)
1 lane sensor (with 2 IR sensors) | Lane recognition (FTF 1-2, FTF 4)
1 ultrasonic sensor | Distance measurement (FTF 2, FTF 4)
1 USB camera | Colour recognition (FTF 2), lane recognition (FTF 3)

Actuators | Function
2 encoder motors | Drive
2 LEDs (2 × white) | Headlights (FTF 2-4)

