
Historical intro to AI planning languages
Not only Machine Learning drives our autonomous cars

This is my 2nd publication in the field of Artificial Intelligence, prepared as part of my project in the AI Nanodegree classes. This time the goal was to write a research paper about important historical developments in the field of AI planning and search. I hope you will like it 🙂.

Planning, or more precisely automated planning and scheduling, is one of the major fields of AI (among others like Machine Learning, Natural Language Processing and Computer Vision). Planning focuses on the realisation of strategies or action sequences executed by:

  • Intelligent agents — autonomous entities (software or hardware) able to observe the world through different types of sensors and perform actions based on those observations.
  • Autonomous robots — physical intelligent agents which deliver goods (factory robots), keep our houses clean (intelligent vacuum cleaners) or explore other worlds in space missions.
  • Unmanned vehicles — autonomous cars, drones or robotic spacecrafts.

To accomplish given tasks, these systems need input data describing the initial state of the world, the desired goals and the available actions. The role of a planning system is then to find a sequence of actions which leads from the initial state to the given goal.

Side note: this sounds simple, but it isn’t. There are different dimensions of the problem to consider: deterministic or nondeterministic actions, a fully or partially observable world state, action concurrency, durations, time limits and many more. Without proper simplification, each problem can lead to combinatorial explosion: the state space grows exponentially with the number of state variables, producing too many combinations for the agent to handle.

To represent planning problems we use Artificial Intelligence planning languages, which describe the environment’s conditions and the desired goals, from which a planner generates a chain of actions.


STRIPS (Stanford Research Institute Problem Solver) is an action language which was part of the first major planning system of the same name.

Shakey, the robot

Originally, STRIPS was the name of the planning component in the software of Shakey, the robot developed at the Stanford Research Institute (SRI) and the first machine able to reason about its own actions. With its abilities (visual analysis, route finding, object manipulation and more), Shakey is considered an ancestor of self-driving cars, military drones, Mars rovers and the whole field of Robotics and AI. While Shakey’s hardware wasn’t very impressive, its software (architecture and algorithms) was a game changer in the world of AI.
As part of this revolution, the STRIPS planner gave Shakey the ability to analyse commands (the goals) and break them down into a plan of all the needed actions (even if Shakey itself wasn’t able to complete all of them).

STRIPS, classical planning language

But what is most interesting, the representational language used by the STRIPS planner has had a much bigger impact on the field of AI than its algorithms, and it is the base for most of the languages used to describe planning problems.

STRIPS as a classical planning language is composed of states, goals and a set of actions:

  • State is a conjunction of positive literals, which may not contain variables or function symbols.
  • Goal, similarly to the state, is a conjunction of positive, ground (no variables, no functions) literals.
  • Actions (also called operators) consist of preconditions and postconditions, both represented as conjunctions of function-free literals. Preconditions describe the state of the world required to perform the action, while postconditions describe the state of the world after the action is executed.
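To make these three ingredients concrete, here is a minimal Python sketch (my own illustration, not code from the STRIPS system), using the predicates from the Shakey example described below:

```python
from collections import namedtuple

# A state is a set of positive ground literals (plain strings here).
state = frozenset({
    "INROOM(ROBOT,R1)",     # the robot is in room R1
    "INROOM(BOX1,R2)",      # the box is in the adjacent room R2
    "CONNECTS(D1,R1,R2)",   # door D1 connects R1 and R2
})

# The goal is likewise a conjunction of positive ground literals.
goal = frozenset({"INROOM(BOX1,R1)"})

# An operator bundles preconditions with its postcondition, split into
# the literals it adds to the state and the literals it deletes from it.
Action = namedtuple("Action", ["name", "pre", "add", "delete"])
gothru = Action("GOTHRU(D1,R1,R2)",
                pre=frozenset({"INROOM(ROBOT,R1)", "CONNECTS(D1,R1,R2)"}),
                add=frozenset({"INROOM(ROBOT,R2)"}),
                delete=frozenset({"INROOM(ROBOT,R1)"}))
```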

An example from the Shakey the robot paper can be helpful in understanding the basics of the STRIPS language. It describes the task of fetching a box from an adjacent room.

The initial state of the world is presented in this image:

Image from Shakey the robot paper, chapter 7: STRIPS


Note: capital letters are constants, while small letters are variables.

As mentioned before, an action can be applied only if the current state of the world meets all of its preconditions. When it is applied, the literals from its postcondition are added to the world state if they are positive, and removed from it if they are negative.
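These add/delete semantics take only a couple of lines of Python (a simplified illustration with literals as plain strings, where the positive and negated postcondition literals are split into explicit add and delete sets):

```python
def applicable(state, preconditions):
    """An action can be applied only if every one of its
    precondition literals holds in the current state."""
    return preconditions <= state

def apply_action(state, add, delete):
    """Positive postcondition literals are added to the state,
    negated ones are removed from it."""
    return (state - delete) | add

# GOTHRU(D1,R1,R2): the robot moves from R1 through door D1 into R2.
state = frozenset({"INROOM(ROBOT,R1)", "CONNECTS(D1,R1,R2)"})
pre = frozenset({"INROOM(ROBOT,R1)", "CONNECTS(D1,R1,R2)"})
add = frozenset({"INROOM(ROBOT,R2)"})
delete = frozenset({"INROOM(ROBOT,R1)"})

if applicable(state, pre):
    state = apply_action(state, add, delete)

print(sorted(state))  # → ['CONNECTS(D1,R1,R2)', 'INROOM(ROBOT,R2)']
```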

Here is the solution (the sequence of actions which achieves the goal, starting from the initial state) for the described example:

  1. GOTHRU(D1,R1,R2)
  2. PUSHTHRU(BOX1,D1,R2,R1)
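This plan can be reproduced with a toy forward breadth-first search over the state space (my own sketch for this simplified two-room domain; the original STRIPS planner instead worked goal-directed, using means-ends analysis):

```python
from collections import deque, namedtuple

Action = namedtuple("Action", ["name", "pre", "add", "delete"])

# Ground actions for the simplified Shakey example (rooms R1, R2, door D1).
actions = [
    Action("GOTHRU(D1,R1,R2)",
           pre=frozenset({"INROOM(ROBOT,R1)", "CONNECTS(D1,R1,R2)"}),
           add=frozenset({"INROOM(ROBOT,R2)"}),
           delete=frozenset({"INROOM(ROBOT,R1)"})),
    Action("PUSHTHRU(BOX1,D1,R2,R1)",
           pre=frozenset({"INROOM(ROBOT,R2)", "INROOM(BOX1,R2)",
                          "CONNECTS(D1,R2,R1)"}),
           add=frozenset({"INROOM(ROBOT,R1)", "INROOM(BOX1,R1)"}),
           delete=frozenset({"INROOM(ROBOT,R2)", "INROOM(BOX1,R2)"})),
]

initial = frozenset({"INROOM(ROBOT,R1)", "INROOM(BOX1,R2)",
                     "CONNECTS(D1,R1,R2)", "CONNECTS(D1,R2,R1)"})
goal = frozenset({"INROOM(BOX1,R1)"})

def plan(initial, goal, actions):
    """Breadth-first search from the initial state until the goal holds."""
    frontier = deque([(initial, [])])
    seen = {initial}
    while frontier:
        state, path = frontier.popleft()
        if goal <= state:          # every goal literal holds in this state
            return path
        for a in actions:
            if a.pre <= state:     # action is applicable here
                nxt = (state - a.delete) | a.add
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, path + [a.name]))
    return None

print(plan(initial, goal, actions))
# → ['GOTHRU(D1,R1,R2)', 'PUSHTHRU(BOX1,D1,R2,R1)']
```

Blind breadth-first search like this only works for tiny domains; real planners prune the exponential state space with heuristics, which is exactly why the representational restrictions of STRIPS matter.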

ADL, PDDL — further developments in representational languages

The STRIPS language was a good starting point for representing planning problems, but there was room for improvement. ADL (Action Description Language) is one of the STRIPS extensions which removed some of its constraints to handle more realistic problems. Unlike STRIPS, ADL doesn’t assume that unmentioned literals are false, but rather unknown, which is known as the Open World Assumption. It also supports negative literals, quantified variables in goals (e.g. ∃x At(P1, x) ∧ At(P2, x)), conditional effects and disjunctions in goals (none of which are allowed in STRIPS).

STRIPS and ADL were the inspiration for a further development of representational languages: PDDL (Planning Domain Definition Language). It was an attempt to standardise planning languages, which made the International Planning Competition (IPC) series possible. In other words, PDDL subsumes STRIPS, ADL and many other representational languages.

Thanks to one common language, the planning competition can compare the performance of planning systems on a set of benchmark problems. Most importantly, having one formal standard lets us not only compare systems and approaches, but also speed up progress in the field. A common formalism is a compromise between expressive power and the progress of basic research (which encourages development from well-understood foundations)¹.

As mentioned at the beginning of this paper, automated planning and scheduling is one of the major fields of AI. The current development of Machine Learning (Deep Learning) and Computer Vision has put planning in the shade (especially in the news and social media). But keeping in mind that it is deeply rooted in our lives (factory robots, intelligent vacuum cleaners, planning agents and more) and that its development shapes our future (autonomous cars, drones), it definitely shouldn’t be ignored.

References, further readings

By Mirek Stanek

Head of Mobile Development at Azimo. Artificial Intelligence adept 🤖. I dream big and do the code ✨.
