Programming language complexity
A language's BNF grammar size is a rough measure - just for a taste :-)
A few examples:
- C++
- Scheme
- Lua
- Ada
- Haskell
It's not clear to me that complexity is even a well-defined term when applied to a programming language.
If by "objective" you mean "quantitative", you could ask such questions as
- How big is an unambiguous grammar?
- How big is a working yacc grammar?
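As a toy illustration of "grammar size" as a quantitative measure, one could simply count productions in a BNF text. The mini-grammar below is an invented arithmetic fragment, not any real language's grammar:

```python
# Toy sketch: compare "grammar size" by counting BNF productions.
# The mini-grammar below is an illustrative fragment, not a real language grammar.

def count_productions(bnf: str) -> int:
    """Count production rules (lines containing '::=') in a BNF string."""
    return sum(1 for line in bnf.splitlines() if "::=" in line)

arith_bnf = """
<expr>   ::= <term> | <expr> "+" <term>
<term>   ::= <factor> | <term> "*" <factor>
<factor> ::= <number> | "(" <expr> ")"
"""

print(count_productions(arith_bnf))  # 3 productions
```

Real grammars would need a proper parse (alternatives per rule, token counts), but the idea is the same: the metric is crude yet computable.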
Since almost no language has a formal semantics, it's hard to do any quantitative studies. But you could ask
- How big is the simplest interpreter for the language, relative to interpreters for other languages that use the same metalanguage (language in which the interpreter is written)? This measure is somewhat related to Kolmogorov complexity.
Except as a matter of curiosity, it's not clear to me that this question is worth asking—it's hard to imagine useful answers.
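To make the interpreter-size idea concrete, here is a deliberately tiny interpreter whose source length in the metalanguage (Python here) could serve as the crude, Kolmogorov-style proxy described above. The expression encoding (nested tuples like `("+", a, b)`) is an assumption for the sketch:

```python
# Minimal sketch: a tiny interpreter whose source length in the metalanguage
# (Python) could serve as a crude, Kolmogorov-style complexity proxy.
# The object language is hypothetical: nested tuples like ("+", a, b).

def evaluate(expr):
    """Evaluate a nested-tuple arithmetic expression to a number."""
    if isinstance(expr, (int, float)):
        return expr
    op, left, right = expr
    l, r = evaluate(left), evaluate(right)
    return {"+": l + r, "-": l - r, "*": l * r}[op]

program = ("+", 1, ("*", 2, 3))
print(evaluate(program))  # 7
```

A richer object language (binding, closures, state) needs a longer interpreter, which is exactly what the proposed measure would register.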
Have a look at Denotational semantics and operational semantics:
Denotational semantics is an approach to formalizing the meanings of programming languages by constructing mathematical objects (called denotations) which describe the meanings of expressions from the languages.
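A hedged sketch of the denotational idea: each syntactic phrase of a toy imperative language is mapped to a mathematical object, here a function from states (variable environments) to states. All names below are illustrative, not from any standard library:

```python
# Sketch: denotations of a toy imperative language as state-to-state functions.
# A "state" is a dict of variable bindings; the combinators are illustrative.

def skip():                          # [[skip]] = identity on states
    return lambda state: state

def assign(var, expr):               # [[var := expr]] updates one binding
    return lambda state: {**state, var: expr(state)}

def seq(c1, c2):                     # [[c1; c2]] = composition of denotations
    return lambda state: c2(c1(state))

# Denotation of the program:  x := 1; y := x + 2
prog = seq(assign("x", lambda s: 1),
           assign("y", lambda s: s["x"] + 2))
print(prog({}))  # {'x': 1, 'y': 3}
```

Note that the program's meaning is the composed function itself; running it on a state is just applying that mathematical object.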
The operational semantics for a programming language describes how a valid program is interpreted as sequences of computational steps. These sequences then are the meaning of the program. In the context of functional programs, the final step in a terminating sequence returns the value of the program. (In general there can be many return values for a single program, because the program could be nondeterministic, and even for a deterministic program there can be many computation sequences since the semantics may not specify exactly what sequence of operations arrives at that value.)
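The small-step flavor of operational semantics can be sketched the same way: the meaning of a program is the sequence of computational steps it takes. The tuple encoding of expressions is again an assumption of the sketch:

```python
# Sketch of small-step operational semantics for toy arithmetic expressions:
# the program's meaning is its sequence of reduction steps.
# Expressions are nested tuples like ("+", a, b); this encoding is assumed.

def step(expr):
    """Perform one reduction step, reducing the leftmost redex first."""
    op, l, r = expr
    if isinstance(l, tuple):
        return (op, step(l), r)
    if isinstance(r, tuple):
        return (op, l, step(r))
    return l + r if op == "+" else l * r

expr = ("+", ("*", 2, 3), 4)
trace = [expr]
while isinstance(expr, tuple):       # record the whole computation sequence
    expr = step(expr)
    trace.append(expr)
print(trace)  # [('+', ('*', 2, 3), 4), ('+', 6, 4), 10]
```

Here the reduction order is fixed (leftmost first), so the trace is deterministic; a semantics that left the order unspecified would admit several traces for the same final value, as the quoted passage notes.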