F Language, Mark Evans, 2 Jul 2003, Post:D/14183
Transframe Language, Mark Evans, 29 Mar 2003, Post:D/12285
TXL, Mark Evans, 19 Feb 2003, Post:D/11050
Typesafe Assembly Language, Mark Evans, 22 Feb 2003, Post:D/11161
Other programming languages can be examined to find useful features to include in D.
(These were originally posted by Mark Evans to the D newsgroup.)
I was amused that CIL (the C Intermediate Language toolkit) is written in OCaml. OCaml just continues to amaze. The CIL license is permissive, so this tool might have uses for D.
"Cyclone is a programming language based on C that is safe, meaning that it rules out programs that have buffer overflows, dangling pointers, format string attacks, and so on. High-level, type-safe languages, such as Java, Scheme, or ML also provide safety, but they don't give the same control over data representations and memory management that C does (witness the fact that the run-time systems for these languages are usually written in C.) Furthermore, porting legacy C code to these languages or interfacing with legacy C libraries is a difficult and error-prone process. The goal of Cyclone is to give programmers the same low-level control and performance of C without sacrificing safety, and to make it easy to port or interface with legacy C code."
"Efficiency and flexibility. Dylan can be used as a dynamic prototyping language(like Smalltalk) or an efficient compiled language. Functional Developer, Functional Objects' Dylan compiler, generates code as good as that of most C compilers. Still, it provides an interactive prototyping environment like those found in Smalltalk or Common Lisp."
This design paper is a good read. The lesson for D is to think more about the fast prototyping side and not view flexibility as the enemy of efficiency. Really they are part of the same team and cover different parts of the field.
"Patterns for how to integrate a functional style of programming into an object oriented environment."
Another item showcasing integration of high-level and low-level language paradigms. You can have both.
From the FAQ:
"...have you tried real-time video grabbing and processing in Matlab? Have you tried to train a one-million weight convolutional neural network in Matlab?"
by Peter Van Roy and Seif Haridi (c) 2001-2003
"One approach to study computer programming is to study programming languages. But there are a tremendously large number of languages, so large that it is impractical to study them all. How can we tackle this immensity? We could pick a small number of languages that are representative of different programming paradigms. But this gives little insight into programming as a unified discipline. This book uses another approach.
"We focus on programming concepts and the techniques to use them, not on programming languages. The concepts are organized in terms of computation models. A computation model is a formal system that defines how computations are done. There are many ways to define computation models. Since this book is intended to be practical, it is important that the computation model should be directly useful to the programmer. We will therefore define it in terms of concepts that are important to programmers: data types, operations, and a programming language. The term computation model makes precise the imprecise notion of 'programming paradigm'. The rest of the book talks about computation models and not programming paradigms. Sometimes we will use the phrase programming model. This refers to what the programmer needs: the programming techniques and design principles made possible by the computation model.
"Each computation model has its own set of techniques for programming and reasoning about programs. The number of different computation models that are known to be useful is much smaller than the number of programming languages. This book covers many well-known models as well as some less-known models. The main criterium for presenting a model is whether it is useful in practice. Each computation model is based on a simple core language called its kernel language. The kernel languages are introduced in a progressive way, by adding concepts one by one. This lets us show the deep relationships between the different models. Often, just adding one new concept makes a world of difference in programming. For example, adding destructive assignment (explicit state) to functional programming allows to do [sic] object-oriented programming. When stepping from one model to the next, how do we decide on what concepts to add? We will touch on this question many times in the book. The main criterium is the creative extension principle. Roughly, a new concept is added when programs become complicated for technical reasons unrelated to the problem being solved. Adding a concept to the kernel language can keep programs simple, if the concept is chosen carefully. This is explained in Section 2.1.2 and Appendix E.
"A nice property of the kernel language approach is that it lets us use different models together in the same program. This is usually called multiparadigm programming. It is quite natural, since it means simply to use the right concepts for the problem, independent of what computation model they originate from. Multiparadigm programming is an old idea. For example, the designers of Lisp and Scheme have long advocated a similar view. However, this book applies it in a much broader and deeper way than was previously done."
One of the Cyclone people (Greg Morrisett) also works on a "typesafe assembly" project, TAL. His work should be quite interesting for D, because it centers on implementing high-level (functional) language features in a strongly typed and compiler-optimized fashion -- all the way down to the back-end nitty-gritty for Intel chips. We are not talking about virtual machines or JIT here; this is real compilation. Functional power is possible in D. - Mark
"TXL is a generalized source-to-source translation system suitable for prototyping computer language processes of any kind. It has been used in rapid prototyping of new programming, specification and command languages, as well as in a wide range of software engineering and optimization tasks.
"TXL takes as input an arbitrary context-free grammar in BNF-like notation, and a set of show-by-example transformation rules to be applied to inputs parsed using the grammar.
"TXL automatically parses inputs in the language described by the grammar, no matter if ambiguous or recursive, and then successively applies the transformation rules to the parsed input until they fail, producing as output the transformed source."
TXL includes both a C and C++ grammar.
"Today's safe programming languages, like C#, are great for weeding out many bugs in programs. Unfortunately, such languages are not suited to programmers who want to control the memory layout of their data structures or the lifetime of a program's resources (like memory). Programmers who need this control are stuck with C or C++. Vault is a safe version of the C programming language, being developed at Microsoft Research, which provides the same level of safety as languages like C#, but allows a programmer to retain control over data layout and lifetime."