Programming language design and implementation
Programming languages are typically created by designing a notation for expressing computer programs and then writing an implementation of that design,[1] usually an interpreter or compiler. Interpreters read programs, usually in some variation of a text format, and perform actions based on what they read, whereas compilers translate code to a lower-level form, such as object code.[2]
Design
In programming language design, there are a wide variety of factors to consider. Some factors may be mutually exclusive (e.g. security versus speed). It may be necessary to consider whether a programming language will perform better interpreted or compiled, whether it should be dynamically or statically typed, whether inheritance will be included, and what the general syntax of the language will be.[3] Many of these decisions can be guided by the goals behind the language: it is important to consider the target audience of a language, its unique features, and its purpose.[4] It is good practice to look at what existing languages lack or make difficult, to ensure a new language serves a purpose.[4]
Various experts have suggested useful design principles:
- In the closing paragraph of an article published in 1972, Tony Hoare offered some general advice for any software project:[5]
- "So my advice to the designers and implementer of software of the future is in a nutshell: do not decide exactly what you are going to do until you know how to do it; and do not decide how to do it until you have evaluated your plan against all the desired criteria of quality. And if you cannot do that, simplify your design until you can."
- At a SIGPLAN symposium in 1973, Tony Hoare discussed various language aspects in some detail.[6] He also identified a number of shortcomings in the programming languages of the time.
- "a programming language is a tool which should assist the programmer in the most difficult aspects of his art, namely program design, documentation, and debugging."
- "objective criteria for good language design may be summarized in five catch phrases: simplicity, security, fast translation, efficient object code, and readability."
- "It is absurd to make elaborate security checks on debugging runs, when no trust is put in the results, and then remove them in production runs, when an erroneous result could be expensive or disastrous. What would we think of a sailing enthusiast who wears his life-jacket when training on dry land but takes it off as soon as he goes to sea?"
- At IFIP Congress 1974, Niklaus Wirth, designer of Pascal, presented a paper "On the design of programming languages".[7] Wirth listed a number of competing suggestions, most notably that a language should be easy to learn and use, it should be usable without new features being added, the compiler should generate efficient code, a compiler should be fast, and that a language should be compatible with libraries, the system it is running on, and programs written in other languages.
Many programming languages have design features intended to make it easier to implement at least an initial version of the compiler or interpreter. For example, Pascal, Forth, and many assembly languages are specifically designed to support one-pass compilation.
New programming languages are often designed to fix (perceived) problems with earlier languages, typically by adding features that make programs simpler to write, even if those features make the interpreter or compiler more complicated. Examples include languages with built-in automatic memory management and garbage collection, or with built-in associative arrays.
On the other hand, a few programming languages, such as BCPL, Pascal, and RPython, were specifically designed to make it relatively easy to write a self-hosting compiler, typically by deliberately leaving out features that make compilation difficult.
Implementation
There are two general approaches to programming language implementation:[8]
- Interpretation: The program is read as input by an interpreter, which performs the actions written in the program.[9]
- Compilation: The program is read by a compiler, which translates it into some other language, such as bytecode or machine code. The translated code may either be directly executed by hardware or serve as input to another interpreter or another compiler.[9]
In addition to these two extremes, many implementations use hybrid approaches such as just-in-time compilation and bytecode interpreters.
Interpreters have some advantages over JIT compilers and ahead-of-time compilers.[10] Interpreters typically support a read–eval–print loop, which makes developing new programs much quicker; compilers force developers into a much slower edit–compile–run–debug cycle.
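As a minimal illustration (not taken from any particular implementation; the function name repl and its prompt are chosen for this sketch), a read–eval–print loop can be as simple as:

```python
# Minimal read-eval-print loop (REPL) sketch: read a line of input,
# evaluate it as a Python expression, and print the result.
def repl(prompt="> "):
    while True:
        try:
            line = input(prompt)
        except EOFError:              # exit on end-of-input (Ctrl-D)
            break
        if not line.strip():
            continue
        try:
            print(eval(line))         # the "eval" and "print" steps
        except Exception as exc:      # report errors without leaving the loop
            print(f"error: {exc}")

if __name__ == "__main__":
    repl()
```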
A typical program runs fastest when compiled with an ahead-of-time compiler. The same program processed and run with a JIT compiler is usually somewhat slower; a version compiled to a p-code intermediate language, such as bytecode, and interpreted by an application virtual machine is slower still; and a purely interpreted version runs slowest of all.[11]
In theory, a programming language can first be specified and an interpreter or compiler for it implemented later (the waterfall model). In practice, lessons learned while implementing a language often affect later versions of its specification, leading to combined programming language design and implementation.
Both interpreters and compilers usually implement some sort of symbol table.
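For illustration, a symbol table can be as simple as a stack of dictionaries mapping names to whatever information the implementation records about them; the class and method names below are chosen for this sketch, not taken from any particular implementation:

```python
# Sketch of a scoped symbol table: a stack of dictionaries mapping names
# to the information the implementation keeps about each identifier.
class SymbolTable:
    def __init__(self):
        self.scopes = [{}]                       # start with a global scope

    def enter_scope(self):
        self.scopes.append({})

    def exit_scope(self):
        self.scopes.pop()

    def define(self, name, info):
        self.scopes[-1][name] = info             # declare in the current scope

    def lookup(self, name):
        for scope in reversed(self.scopes):      # innermost scope wins
            if name in scope:
                return scope[name]
        raise KeyError(f"undeclared identifier: {name}")

table = SymbolTable()
table.define("x", {"type": "int"})
table.enter_scope()
table.define("x", {"type": "float"})             # shadows the outer x
print(table.lookup("x"))                         # {'type': 'float'}
```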
Interpreters
An interpreter is a program that reads another program, typically as text,[4] as seen in languages like Python.[2] Interpreters read code and produce the result directly.[12] They typically read code line by line, parsing it and executing it as operations and actions.[13]
An interpreter is composed of two parts: a parser and an evaluator. After a program is read as input by an interpreter, it is processed by the parser. The parser breaks the program into language components to form a parse tree. The evaluator then uses the parse tree to execute the program.[14]
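A minimal sketch of this structure, reusing Python's ast module as the parser and a hand-written evaluator for the tree (the function names here are chosen for the example), might look like:

```python
import ast, operator

# Minimal interpreter sketch: the parser builds a tree (here by reusing
# Python's ast module), and the evaluator walks that tree.
OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def evaluate(node):
    if isinstance(node, ast.Expression):
        return evaluate(node.body)
    if isinstance(node, ast.BinOp):
        return OPS[type(node.op)](evaluate(node.left), evaluate(node.right))
    if isinstance(node, ast.Constant):
        return node.value
    raise ValueError(f"unsupported syntax: {ast.dump(node)}")

tree = ast.parse("(1 + 2) * 3", mode="eval")     # parser: text -> parse tree
print(evaluate(tree))                            # evaluator: tree -> 9
```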
Virtual machine
A virtual machine is a special type of interpreter that interprets bytecode.[9] Bytecode is a portable low-level code similar to machine code, though it is generally executed on a virtual machine instead of a physical machine.[15] To improve efficiency, many programming languages such as Java,[15] Python,[16] and C#[17] are compiled to bytecode before being interpreted.
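For example, CPython's standard dis module can display the bytecode a Python function is compiled to before the virtual machine interprets it (the exact opcodes vary between Python versions):

```python
import dis

def add(a, b):
    return a + b

# Disassemble the bytecode CPython compiled the function to.
dis.dis(add)
# The output lists instructions such as LOAD_FAST followed by an add
# instruction (BINARY_ADD or BINARY_OP, depending on the version).
```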
Just-in-time compiler
Some virtual machines include a just-in-time (JIT) compiler to improve the efficiency of bytecode execution. While the bytecode is being executed by the virtual machine, if the JIT compiler determines that a portion of the bytecode will be used repeatedly, it compiles that particular portion to machine code. The JIT compiler then stores the machine code in memory so that it can be used by the virtual machine. JIT compilers try to strike a balance between longer compilation time and faster execution time.[9]
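The hot-spot detection described above can be sketched conceptually as follows. This is an illustrative simplification only: a real JIT emits machine code, whereas here "compilation" is simulated by swapping in a cached Python function, and the class name and threshold are invented for the example.

```python
# Conceptual sketch of hot-spot counting: a region of code is interpreted
# until it has run often enough, then a compiled version is used instead.
HOT_THRESHOLD = 1000

class Region:
    def __init__(self, interpret, compile_to_native):
        self.interpret = interpret               # slow, always-available path
        self.compile_to_native = compile_to_native
        self.counter = 0
        self.native = None                       # filled in once the region is hot

    def run(self, *args):
        if self.native is not None:
            return self.native(*args)            # fast path: use compiled code
        self.counter += 1
        if self.counter >= HOT_THRESHOLD:        # region became "hot": compile it
            self.native = self.compile_to_native()
        return self.interpret(*args)             # slow path: keep interpreting
```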
Compilers
A compiler translates programs written in one language into another language. Most compilers are organized into three stages: a front end, an optimizer, and a back end. The front end is responsible for understanding the program. It makes sure a program is valid and transforms it into an intermediate representation, a data structure used by the compiler to represent the program. The optimizer improves the intermediate representation to increase the speed or reduce the size of the executable which is ultimately produced by the compiler. The back end converts the optimized intermediate representation into the output language of the compiler.[18]
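A toy sketch of this three-stage structure for arithmetic expressions, with Python's ast module standing in for the front end, constant folding as the optimizer, and an imaginary stack machine as the target (all names here are chosen for the example, not drawn from any real compiler):

```python
import ast, operator

# Toy three-stage compiler: front end (parse to an IR), optimizer
# (constant folding), and back end (emit stack-machine instructions).
OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def front_end(source):
    return ast.parse(source, mode="eval").body           # IR: a parse tree

def optimize(node):
    if isinstance(node, ast.BinOp):                      # fold constant sub-expressions
        node.left, node.right = optimize(node.left), optimize(node.right)
        if isinstance(node.left, ast.Constant) and isinstance(node.right, ast.Constant):
            return ast.Constant(OPS[type(node.op)](node.left.value, node.right.value))
    return node

def back_end(node, code=None):
    code = [] if code is None else code
    if isinstance(node, ast.Constant):
        code.append(("PUSH", node.value))
    elif isinstance(node, ast.Name):
        code.append(("LOAD", node.id))
    elif isinstance(node, ast.BinOp):
        back_end(node.left, code)
        back_end(node.right, code)
        code.append((OPS[type(node.op)].__name__.upper(),))
    return code

print(back_end(optimize(front_end("x * (2 + 3)"))))
# [('LOAD', 'x'), ('PUSH', 5), ('MUL',)]
```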
If a compiler for a given high-level language produces output in another high-level language, it is called a transpiler. Transpilers can be used to extend existing languages or to simplify compiler development by exploiting portable and well-optimized implementations of other languages (such as C).[9]
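As a minimal, hypothetical illustration, a transpiler for arithmetic expressions could emit C source text from input written in Python syntax (the function to_c and the supported subset are invented for this sketch):

```python
import ast

# Minimal transpiler sketch: translate an arithmetic expression written
# in Python syntax into C source text.
C_OPS = {ast.Add: "+", ast.Sub: "-", ast.Mult: "*", ast.Div: "/"}

def to_c(node):
    if isinstance(node, ast.BinOp):
        return f"({to_c(node.left)} {C_OPS[type(node.op)]} {to_c(node.right)})"
    if isinstance(node, ast.Constant):
        return str(node.value)
    if isinstance(node, ast.Name):
        return node.id
    raise ValueError("unsupported construct")

expr = ast.parse("a * a + 2 * a + 1", mode="eval").body
print(f"double f(double a) {{ return {to_c(expr)}; }}")
# double f(double a) { return (((a * a) + (2 * a)) + 1); }
```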
Many combinations of interpretation and compilation are possible, and many modern programming language implementations include elements of both. For example, the Smalltalk programming language is conventionally implemented by compilation into bytecode, which is then either interpreted or compiled by a virtual machine. Since Smalltalk bytecode is run on a virtual machine, it is portable across different hardware platforms.[19]
Multiple implementations
Programming languages can have multiple implementations. Different implementations can be written in different languages and can use different methods to compile or interpret code. For example, implementations of Python include:[20]
- CPython, the reference implementation of Python
- IronPython, an implementation targeting the .NET Framework (written in C#)
- Jython, an implementation targeting the Java virtual machine
- PyPy, an implementation designed for speed (written in RPython)
Process
The process of creating a programming language differs from developer to developer; however, the following is a general outline of how one might proceed, covering the common concepts:
- Design: Core aspects of the language are decided, such as its types, syntax, semantics, and library usage.[21]
- Consideration: Syntax, implementation strategy, and other practical factors are weighed. Languages like Python interpret code at runtime, whereas C++ was initially implemented as a translator to C, reusing C's compiler.[22]
- Create an implementation: A first implementation is written. A compiler translates the code into lower-level forms, often down to assembly or machine code.[23]
- Improve the implementation: The implementation is refined and the language expanded, aiming for enough functionality to bootstrap, i.e. for the language to be capable of expressing an implementation of itself.
- Bootstrapping: If the language is compiled, the developer may bootstrap it by rewriting the compiler in the language itself.[24] This helps with finding bugs and demonstrates the language's capability.[25] Bootstrapping also has the benefit that, from then on, the language only needs to be developed in itself.
References
- ^ Tomassetti, Federico (8 May 2019). "How would I go about creating a programming language?". Strumenta. Retrieved 3 March 2023.
- ^ a b "Compiler vs Interpreter". Geeks For Geeks. 17 January 2022. Retrieved 3 March 2023.
- ^ "Programming Languages and Learning". Washington EDU. University of Washington. Retrieved 2 March 2023.
- ^ a b c "How are Programming Languages created". GoNoCode. 8 December 2021. Retrieved 2 March 2023.
- ^ Hoare, C. A. R. (1972). "The Quality of Software". Software: Practice and Experience. 2 (2): 103–105. doi:10.1002/spe.4380020202. S2CID 62662609.
- ^ "Hints on Programming Language Design" (PDF). 1973. Retrieved 7 March 2023.
- ^ "On the design of programming languages" (PDF). 1974. Retrieved 9 March 2023.
- ^ Ranta, Aarne (February 6, 2012). Implementing Programming Languages (PDF). College Publications. pp. 16–18. ISBN 9781848900646. Archived (PDF) from the original on Nov 7, 2020. Retrieved 22 March 2020.
- ^ a b c d e Baker, Greg. "Language Implementations". Computing Science - Simon Fraser University. Archived from the original on Mar 8, 2019. Retrieved 22 March 2020.
- ^ KernelTrap. "More Efficient Bytecode Interpreters Instead of Just-in-Time Compilation".
- ^ Larry Fish. "The Story Behind Apex/XPL0 and the 6502 Group".
- ^ Diver, Laurence (7 December 2021). "Interpreting the Rule(s) of Code: Performance, Performativity, and Production". MIT Computational Law Report.
- ^ Rathi, Mukul (31 March 2017). "How I wrote my own "proper" programming language". mukulrathi. Retrieved 2 March 2023.
- ^ Evans, David (19 August 2011). Introduction to Computing (PDF). University of Virginia. p. 211. Retrieved 22 March 2020.
- ^ a b Sridhar, Jay (Aug 29, 2017). "Why the Java Virtual Machine Helps Your Code Run Better". MakeUseOf. Retrieved 22 March 2020.
- ^ Bennett, James (April 23, 2018). "An introduction to Python bytecode". Opensource.com. Retrieved 22 March 2020.
- ^ Ali, Mirza Farrukh (Oct 12, 2017). "Common Language Runtime(CLR) DotNet". Medium. Retrieved 22 March 2020.
- ^ Cooper, Keith; Torczon, Linda (7 February 2011). Engineering a Compiler (2nd ed.). Morgan Kaufmann. pp. 6–9. ISBN 9780120884780.
- ^ Lewis, Simon (May 11, 1995). The Art and Science of Smalltalk (PDF). Prentice Hall. pp. 20–21. ISBN 9780133713459. Retrieved 23 March 2020.
- ^ "Alternative Python Implementations". Python.org. Retrieved 23 March 2020.
- ^ Chouchanian, Vic. "Programming Languages". California State University Northridge. Retrieved 2 March 2023.
- ^ Stroustrup, Bjarne. "A History of C++: 1979–1991" (PDF). Archived (PDF) from the original on 2 February 2019. Retrieved 18 July 2013.
- ^ Ferguson, Andrew. "A History of Computer Programming Languages". Brown University. Retrieved 2 March 2023.
- ^ Glück, Robert (2012). "Bootstrapping compiler generators from partial evaluators". In Clarke, Edmund; Virbitskaite, Irina; Voronkov, Andrei (eds.). Perspectives of Systems Informatics: 8th International Andrei Ershov Memorial Conference, PSI 2011, Novosibirsk, Russia, June 27 – July 1, 2011, Revised Selected Papers. Lecture Notes in Computer Science. Vol. 7162. Springer. pp. 125–141. doi:10.1007/978-3-642-29709-0_13.
Getting started presents the chicken-and-egg problem familiar from compiler construction: one needs a compiler to bootstrap a compiler, and bootstrapping compiler generators is no exception.
- ^ "Installing GCC: Building". GNU Project - Free Software Foundation (FSF).
External links
Media related to Compiling and linking at Wikimedia Commons