This post is an excerpt from the theory part (Chapter 2) of the book Algorithmics for Hard Problems. Honestly, it is not all that useful in practice, but it provides a mathematical view of algorithms, which is very interesting.
An alphabet $\Sigma$ is a non-empty, finite set. A symbol is an element of an alphabet $\Sigma$.
A word over $\Sigma$ is a finite sequence of elements of $\Sigma$; the empty word $\lambda$ consists of zero elements.
For an alphabet $\Sigma$, the set of all words over $\Sigma$ is $\Sigma^*$. $\Sigma^n$ is the set of all words over $\Sigma$ whose length is $n$.
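As a quick illustration (the Python here is my own addition, not from the book), the finite slice $\Sigma^n$ of $\Sigma^*$ can be enumerated directly:

```python
from itertools import product

def words_of_length(sigma, n):
    """All words of length n over the alphabet sigma, i.e. Sigma^n."""
    return ["".join(w) for w in product(sorted(sigma), repeat=n)]

# Over Sigma = {0, 1}, Sigma^2 contains exactly |Sigma|^n = 2^2 = 4 words.
print(words_of_length({"0", "1"}, 2))  # ['00', '01', '10', '11']
```

Note that $\Sigma^*$ itself is infinite, so only finite slices like this can be materialized.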
The alphabet $\{0, 1, \#\}$ can be used to code graphs ($\#$ can be any separator symbol), since a word over it can spell out an adjacency matrix row by row.
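A minimal sketch of this encoding (my own illustration, assuming rows of the adjacency matrix are concatenated and separated by `#`):

```python
def encode_graph(adj):
    """Encode an adjacency matrix as a word over {0, 1, #}:
    each row becomes a block of bits, rows are separated by '#'."""
    return "#".join("".join(str(bit) for bit in row) for row in adj)

# A triangle on 3 vertices:
adj = [[0, 1, 1],
       [1, 0, 1],
       [1, 1, 0]]
print(encode_graph(adj))  # 011#101#110
```

The resulting word uniquely determines the graph, since the `#` symbols recover the row boundaries.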
Every set $L \subseteq \Sigma^*$ is a language over $\Sigma$. It is a subset of the set of all possible words.
A decision problem asks whether a given input has a prescribed property.
A decision problem is a triple $(L, U, \Sigma)$ where $L \subseteq U \subseteq \Sigma^*$. An algorithm $A$ solves the decision problem $(L, U, \Sigma)$ if, for every word $x \in U$, $A(x) = 1$ if $x \in L$ and $A(x) = 0$ if $x \in U - L$.
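A concrete toy instance (my own example, not from the book): let $U$ be all binary words and $L$ the binary palindromes among them. An algorithm deciding this problem can be sketched as:

```python
def A(x: str) -> int:
    """Decides the language L of binary palindromes:
    returns 1 if x is in L, and 0 if x is in U - L
    (U here being all words over {0, 1})."""
    return 1 if x == x[::-1] else 0

print(A("0110"))  # 1: "0110" reads the same backwards, so it is in L
print(A("011"))   # 0: "011" is in U - L
```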
An optimization problem is a 7-tuple $U = (\Sigma_I, \Sigma_O, L, L_I, \mathcal{M}, cost, goal)$, where
$\Sigma_I$ is the input alphabet; it is used to encode/represent the problem instances,
$\Sigma_O$ is the output alphabet; it is used to encode/represent the outputs,
$L \subseteq \Sigma_I^*$ is the language of feasible problem instances, i.e., all syntactically possible problem instances,
$L_I \subseteq L$ is the language of the (actual) problem instances of $U$,
$\mathcal{M}$ is a function from $L$ to the power set of $\Sigma_O^*$; for every $x \in L$, $\mathcal{M}(x)$ is called the set of feasible solutions for $x$,
$cost$ is the cost function; for every pair $(u, x)$, where $x \in L$ and $u \in \mathcal{M}(x)$, it assigns a positive real number $cost(u, x)$,
$goal \in \{minimum, maximum\}$; if $goal = minimum$, the problem is a minimization problem, and a maximization problem otherwise.
For every $x \in L_I$, a feasible solution $y \in \mathcal{M}(x)$ is called optimal for $x$ and $U$ if $cost(y, x) = goal\{cost(z, x) \mid z \in \mathcal{M}(x)\}$.
For example, $x$ can be a word encoding a specific graph by representing its adjacency matrix. Suppose the shortest path from node $s$ to node $t$ is wanted. The set of feasible solutions $\mathcal{M}(x)$ consists of all paths from $s$ to $t$, and the goal is minimization. The optimal solution is a path of shortest length.
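A brute-force sketch of this example (my own illustration on a small hand-made graph): enumerate the feasible solutions, i.e., all simple paths from the source to the target, and pick one minimizing the cost, here taken to be the number of edges.

```python
def feasible_paths(adj, s, t, path=None):
    """M(x): all simple paths from s to t in the graph given by
    the adjacency matrix adj (the feasible solutions for x)."""
    path = (path or []) + [s]
    if s == t:
        return [path]
    paths = []
    for v, edge in enumerate(adj[s]):
        if edge and v not in path:
            paths += feasible_paths(adj, v, t, path)
    return paths

def cost(path):
    """cost(u, x): number of edges on the path u."""
    return len(path) - 1

# Square graph: 0-1, 0-2, 1-3, 2-3.
adj = [[0, 1, 1, 0],
       [1, 0, 0, 1],
       [1, 0, 0, 1],
       [0, 1, 1, 0]]

# goal = minimum: an optimal solution minimizes cost over M(x).
best = min(feasible_paths(adj, 0, 3), key=cost)
print(best, cost(best))  # [0, 1, 3] 2
```

Of course, exhaustively enumerating $\mathcal{M}(x)$ is only viable for tiny instances; the whole point of the book is what to do when it is not.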