Talk:Metacompiler


RFC

The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.


Revised 11/3/2014. The description in question:

The feature that sets a metacompiler apart from a standard compiler-compiler is that a metacompiler is written in its own language and translates itself.

Revised 11/17/2014

There is some confusion about this RFC. It is basically about the claim that a metacompiler is written in its own language and translates itself.

There are many metacompilers that are not written in their own language, nor do they translate themselves.

META I, Val Schorre, hand-coded, was used to compile META II.
Meta III, Schneider and Johnson, was implemented completely in assembly language.
Book1, Erwin Book, written in LISP
Book2, Erwin Book, written in LISP
SIMPLE, written in PL/I
CGS, Warshall and Shapiro, written in CL-1.

The following statement has confused the issue here.--Steamerandy (talk) 13:33, 18 November 2014 (UTC)[reply]

It is an unsupported attribution claim made by FORTH programmers. It is disputed by the Forth (programming language) topic here on Wikipedia. See Wikipedia:Manual of Style/Words to watch, Unsupported attributions.

Sorry if the Forth metacompiler claim above has confused the issue. It seems the originator of this topic is an advocate of the FORTH metacompiler claim. The other part of the RFC concerned the links used in explaining the self-compiling operation; it followed the FORTH description of a metacompiler. Computer science terms were used and linked to their topics here on Wiki. The links and description were gibberish. "A metacompiler is defined by a set of grammar production rules defining itself, written in its own specialized language": grammar was linked to Formal grammar and production rules linked to Production (computer science). Both are basically formal grammar topics. The production link describes the formal process of building strings of a formal grammar. It is gibberish because there is no semantic process in the description, yet one is being implied by the phrase "production rules". Many metacompilers take as input analytic grammars, not production (formal) grammars. --Steamerandy (talk) 23:14, 17 November 2014 (UTC)--Steamerandy (talk) 13:39, 18 November 2014 (UTC)[reply]

After the RFC was posted, references were added to this line:

Defining itself and translating itself this way constitutes the meta-step that sets a metacompiler apart from other compiler-compilers.

The references point to Forth sites claiming Forth to be a metacompiler. These are unsupported claims made by Forth programmers that are not supported in mainstream Computer Science. There is no such thing as a meta-step in Computer Science as described by the fringe Forth language proponents. I cannot find any reference to the term meta-step in any Computer Science publication; it is only found on some Forth programming sites. When there was a SIGForth group in the ACM it was supported by the SIGPLAN group and made up less than 3% of the SIGPLAN group. There is a fringe group of Forth fanatics having the misguided conception that Forth is a metacompiler. The norm is that most compilers are self-hosting, compiling themselves. See Self-hosting compilers.

I feel that this description of a metacompiler is used only by the FORTH fringe, and thus all references to self-compiling being the feature that sets metacompilers apart from other compiler-compilers should be removed.

A more useful definition would be along the lines of a metacompiler compiling a metalanguage. The description of a metacompiler compiling a metalanguage is stated in ACM documents that can be found in the ACM archive. Other documents on metacompilers saying the same are the TREEMETA documents from Stanford Research Institute, and BOOK1, BOOK2, and CWIC from System Development Corporation.

My opinion is that this article is the unsubstantiated opinion of one Damon Simms. Its description conflicts with the well-known metacompilers and would exclude them, as they are analytical parser languages not using a generative parser as stated in the article. All references given are to analytical parsers; none of the metacompilers talked about here has a generative syntax description language. The main point of dispute is the idea that a metacompiler compiles itself. Not one example of such a compiler having a generative syntax language is given here.

So please side with removing the Forth fringe's misconceived idea of a metacompiler.


His response to my suggested changes: "I seriously doubt you understand this topic and associated concepts. Any changes you make to the Metacompiler Wikipedia page I will consider an act of ignorant vandalism, and will report it as such"

"I'm tired of ignoramuses coming on here who know a little somethin about something and think they know everything"

Note: I asked for a list of metacompilers and didn't get one. No references were given for his claims.

I am sorry I have to make this RFC, but with the threat of being reported for vandalism I must seek outside opinions.

--Steamerandy (talk) 04:46, 4 November 2014 (UTC)[reply]

Original post made --Steamerandy (talk) 00:52, 8 October 2014 (UTC)[reply]

Response to RFC claims

The person filing the RFC, Steamerandy, first started editing this topic 3 weeks ago, and his additions were accepted, even though they appeared to be erroneous. Unfortunately the documentation to challenge his changes is now classified/secret by the US government and therefore unavailable -- seriously, I'm not making this up: CWIC became a classified project of SDC, System Development Corporation, a research company in Santa Monica that did a lot of work for the military. From Steamerandy's own Talk page:

"CWIC was developed at SDC a government think tank. It was classified by the government some time in the early 70s. But I got my manual from Erwin at an ACM meeting before it was classified."

So he made changes based on a manual none of the rest of us have access to, but hey, additional information on this topic is always appreciated.

A couple weeks passed and I saw he was making changes to other Wikipedia pages related to this topic. All fine and good, Wikipedia can always use knowledgeable editors. And he also studied and learned the Wikipedia rules.

But then he started insisting that this article was wrong, challenged the title and thought it should be called "Meta compiler" instead of as it is now, "Metacompiler" as a single word. He repeatedly provided his biography at length, in several places here on Wikipedia on the different Talk pages, as indication that he is qualified to speak for this topic. He claims he has done a lot of work with this technology, a vast amount of expertise. And he began wanting to change everything about the Metacompiler article based on his views of reality.

His expertise is largely based on a dead project from 50 years ago, and documentation now classified as secret and therefore unavailable to any of the rest of us. I call it a dead project, because after the government made it Classified, no more information about the project became available to the public. We're pretty sure it still continued as research, but who knows? But others have continued to carry on research into metacompilers over the years since then which isn't classified. Metacompilers play an important part in Domain Analysis and domain-specific languages, for example.

About a week ago Steamerandy posted some responses on this talk page ridiculing a technical answer I had posted to someone else's question. I responded, based on standard computer science concepts which can be found here on Wikipedia (bootstrapping was the topic). Not only did he stand by his ridicule of my responses, he posted his technical biography as proof he knew what he was talking about, and to claim I was wrong.

Not only that, he also showed up on my User Talk page, giving again his long technical biography, past glories and so on, and then proceeded to imply that I didn't know what I was talking about, and additionally saw fit to insult and belittle myself and my experience. He wrote "Looks like I was programming before you were born" and "You youngsters just think you know everything!!" (see https://en.wikipedia.org/wiki/User_talk:Damon_Simms). By the way, I am his age and have almost as many years of active work in the computer field as he does, over 40.

I don't mind someone challenging what I write based on superior information, but he seemed intent on denigrating and denying anything I wrote based on my background in Computer Science, which includes Bachelors and Masters degrees in Computer Science. Everything I write here is based on my education and 40 years of experience, both in academia and industry. I maintain contact with experts in the field and check with them to vet what I have written for this article.

I continually tried to respond to Steamerandy's complaints by citing references here on Wikipedia and elsewhere on the web. His belligerence seemed to escalate. He continually muddled and misrepresented my responses and references. At one point he seemed to start researching on Wikipedia, dragging up a mish-mash of related topics which he claimed refuted what is stated in this article. He keeps throwing in random references to theory articles and concepts which he claims counters what I have written here. He even disputes basic terminology, like metacompiler and bootstrapping, which have been standard terms of art in the computer field for at least 50 years.

Steamerandy just wants to get his way. He is a very strong proponent of the Schorre family of metacompilers and seems intent on making this article just about that narrow group of metacompilers. There is a long, albeit thin, line of research and real-world application of metacompilers in Software Engineering. He even proposed changing the title of the article to "Meta programming" to make it solely about that Schorre family of metacompilers. At first I didn't understand where he was coming from, but now I think I understand his confusion.

The first two metacompilers made by Schorre were called "Meta" -- Meta-I and Meta-II. I believe Schorre named his metacompiler "Meta" after the term "metacompiler", just as Microsoft named their word processing program "Word". It seems Steamerandy believes the opposite, that "metacompiler" is named after those first programs of Schorre and there is nothing special about the term "metacompiler". He completely denies all the evidence I have produced backing up the idea that "metacompiler" is a basic concept long accepted in Computer Science and Compiler Development and Software Engineering.

For example, the Forth community has a long history of using their version of a metacompiler to bootstrap their language system onto new hardware. A link to one of the many Forth websites makes that clear:

http://www.forthfreak.net/index.cgi?MetaCompiler -- the Forth Metacompiler
from that page:
meta: (greek) A prefix, meaning "one level of description higher"
A metacompiler is a compiler which processes its own source code, resulting in an executable version of itself. Many Forth-systems have been written as metacompilers. Forth metacompilers can reduce the porting effort, especially to badly supported platforms, by avoiding the need of assemblers, compilers or tool chains, in order to get a program running on that platform.

Here the Forth community uses the term "metacompiler" as it has been accepted for 40 years. Yet Steamerandy seems to claim the Meta family of metacompilers from Schorre are the only metacompilers, and even disputes the word "metacompiler" as a valid term and concept in Computer Science. When I presented him with 2 examples of the Forth metacompilers, he confused 4 different referenced products and claimed they weren't really metacompilers. In doing so, he completely ignored the reference I included above, which directly contradicts everything he seems to be claiming here.

I usually welcome knowledgeable comments from contributors, but having read Steamerandy's posts and his proposed changes to this article, this RFC concerns me greatly. He seems intent on imposing his limited views, he won't accept the fact that others have also created metacompilers, and he continues to deny concepts long accepted in Computer Science -- it's almost as if he believes he's the only expert in this field, and if he hasn't heard of something then it must not exist or is bogus.

For example, his first claim from his RFC request above, which he seems to repeat over and over, even after I have presented counter-evidence:

the description "The feature that sets a metacompiler apart from a standard compiler-compiler is that a metacompiler is written in its own language and translates itself" is in question

I believe the quotes from that Forth webpage above satisfy that complaint. It clearly states that a metacompiler "is a compiler which processes its own source code, resulting in an executable version of itself". Furthermore, it also disputes his definition of the prefix "meta-", which he keeps claiming over and over that we got wrong here, thus nullifying our use of the term "metacompiler". The Forth reference also shows that the Forth community has its own line of metacompilers, satisfying his repeated request to show any other example of metacompilers besides the ones he's pushing. Despite providing this evidence, Steamerandy continues to deny and obfuscate this obvious data countering his claim.

As for his second claim in his RFC request, that I am excluding the Meta-II metacompilers: it is based on his misunderstanding of well-accepted terminology from compiler theory and practice. He claims:

"A metacompiler is defined by a set of grammar productions defining itself" defines a meta compiler as using a generative grammar. A great many meta compilers use analytical grammars.
My opinion is that this article is the unsubstantiated opinion of one Damon Simms. It has conflicting description with the well known metacompilers and would exclude them as they are analytical parser languages not using a generative parser as stated in the article.

I believe he misconstrues the meaning of "grammar productions" to mean something having to do with generative grammars instead of analytical grammars. In fact, the term "grammar production" is a term of art from computer science referring to transformation rules, which are used in all kinds of grammars. It has nothing to do with "generative" anything, as far as I know. I modified that statement in the article to help avoid any future confusion, by changing "grammar productions" to "grammar production rules". See Wiki page Production (computer science).

So this is a common pattern: Steamerandy misconstrues or misunderstands what is written in the article or in my responses to him, and then based on that misunderstanding claims it is wrong. He then dredges up a mish-mash of stuff from other Wikipedia pages, posts it as a response on this Talk page, and then claims it proves his point. He then uses that as a basis for wanting to change everything about this article to make it about his own pet projects and interests.

I'm also concerned because so far his writing has been atrocious -- he muddles things together, sometimes almost to the point of gibberish. I have degrees in this stuff, and it seems he conflates formal theory and parsing technology, cutting and pasting pieces out of Wikipedia articles, claiming they prove his point, when in fact it is cobbled-together nonsense. How do you respond to that?

But additionally, I have seen over and over that he makes mistakes, atrocious mistakes, in his grammar and his spelling and his examples. When I can understand them, there also seem to be flaws in his claims and his arguments. And when provided with facts disputing what he says, he usually denies or obfuscates the facts, but sometimes he changes his position, incorporating the new facts as if he had accepted them all along, but then finds other reasons to find fault in the information presented here.

It's almost as if it's important to him to show that I am somehow wrong about something. His latest attempt at this is this RFC, an attempt at replacing what I and others have written here with his own hacked together parochial view of this topic, with emphasis on his own pet technology, the Meta-II brand of metacompilers -- to the detriment of other equally important and valid technology. It's a way to seek your approval and proof that he is right and I am wrong.

For me this is not personal; the only skin I have in this game is that I want to see the work of Schorre and others made available and accessible too. He continually claims that my description of metacompilers invalidates the Meta-II family of programs he is so fond of, when in fact it is just the opposite. He continually misconstrues the facts, misrepresents the theory and its implications, and frankly I don't think he understands it. And on top of that, he constantly misrepresents what I have written with his mish-mash of compiler theory just to prove I'm wrong and I don't know what I'm talking about.

I've taken both undergraduate and graduate classes in compiler theory, as well as 2 practicum classes in which we built little mini compilers for the experience and to see how it's really done. And then later I encountered Meta-II and its derived metacompilers, which were barely touched on in my academic classes, and I was overwhelmed how simple and powerful the concept is. I have used metacompiler tools in my software engineering work for the past 30 years. That's why I promote this technology. And that's why I don't want to see the article here muddled by a self-promoting retired tech person who wants to make it about their own personal history.

If you read my comments you will see that I have at times been gruff in my responses, but the guy shows up and immediately insults me on my own Talk page, ridicules the topic and the article, ridicules sincere responses I have posted on this Talk page, and blusters about how everything should be done differently because he has more experience and a high IQ, or something like that. And he writes badly to boot. And from my perspective, is ignorant of all that has gone on in computer science and compiler technology since that dead project 50 years ago.

I appreciate the fact that Steamerandy follows the Wikipedia rules and is contributing. He is obviously passionate about the topic, albeit in a very narrow-minded tunnel-vision way. I have even offered multiple times to help him with his current project he calls cc (for compiler compiler), and have offered links to webpages which can provide him with even more help. But so far he only seems intent on changing what's here and proving me wrong.

Let me summarize:

  • Steamerandy claims this article should be changed because statements in the article are wrong
  • I have provided multiple references to counter his claims, which he ignores or misconstrues
  • He wants to change the article to a more narrow explicit focus that will be about one instance of the technology, his favorite
  • He doesn't even like the article topic, disputing that it is a real topic
  • He has even stated the whole article should be deleted based on his insistence that other metacompilers don't exist
  • He cobbles together bits and pieces from related articles on computer theory and research to make his point, but often they don't make sense, at least not to me (hey, what do I know, I just have 2 degrees and 40 years experience in this stuff)
  • When provided with references that counter his claims -- such as the existence of other metacompilers, or examples of the use of "metacompiler" as an accepted concept that exists apart from his own narrow definition -- he ignores or muddles the evidence, or explains them away with muddled logic
  • He keeps trying to show that key points in the article are wrong, but as far as I can tell it's based on his misreading of the info and a failure to keep up in this topic over the years, except for his own little narrow slice of the technology -- I have encountered his type before, if he doesn't know about something, then it must not exist, even feeling he can make fun of anyone associated with it
  • He keeps posting his own technical history all over Wikipedia, on the Talk pages of related articles -- I assume as proof of his authority in this subject, but I'm not sure and it does make me wonder

I will add more information here if I think of it.

Please reject his RFC, as I think it is based on an unwillingness to consider evidence of the work of others, and an insistence on a narrow parochial view which denies anything that falls outside that view. Thanks.

Damon Simms (talk) 14:44, 10 October 2014 (UTC)[reply]


Counter-response to the adversarial "Response to RFC claims"

The adversary here appears to have his own agenda: to push the Forth minority definition of metacompiler. What makes me think this is a comment he made:

Forget about Meta, cast it into Hell... for God's sake, learn Forth.

My adversary says FORTH is a metacompiler.

Really he said:

The forth community uses the term "metacompiler" as it has been accepted for 40 years.

OK. So I searched for Forth metacompiler. The Forth concept of a metacompiler is simply a compiler that compiles itself. From Forth (programming language), www.forthos.org, and other referenced sites:

* When Forth is coded in Forth, the process of building a new Forth system is called metacompilation. Like most metacompilers, ForthOS's has restrictions on what kinds of Forth code are permissible. (Forth (programming language); www.forthos.org; www.forth.org metacompiler; Journal of FORTH Application and Research, Volume 4, Issue 2, June 1986, pages 257-258; and here on Wikipedia: Forth meta compilation)
* The LMI Metacompiler is written in Forth-83 and has minimal dependence on the word size or structure of the host or target systems.
* The input to the (LMI) Metacompiler may be either traditional Forth screens or ordinary text files. Source files may "include" other files, up to 4 levels of nesting.

When I checked the Forth reference here on Wikipedia I found that Forth compiling itself is not recognized in mainstream computer science as a metacompiler or as doing a meta-compilation.

Wikipedia: Forth (Self-compilation and cross compilation) "A fully featured Forth system with all source code will compile itself, a technique commonly called meta-compilation by Forth programmers (although the term doesn't exactly match meta-compilation as it is normally defined)."

So the Forth metacompiler takes the Forth language as input and is written in Forth. How is this different from a PASCAL compiler written in PASCAL, or a C compiler written in C?

When PASCAL is coded in PASCAL, the process of building a new PASCAL compiler is called metacompilation. Like most metacompilers, PASCAL has... NOT.

My adversary appears not to have investigated his sources. He says I misconstrue his references when I find in them exactly the opposite of what he says is the defining attribute of a metacompiler. I "misconstrue" his references by finding that they are either not talking about a metacompiler that matches his narrow view or are not about a metacompiler at all.

It just seems that I misconstrue his references by actually reading them.

So what have I found in the references my adversary gave?

Forth freaks consider ForthOS a metacompiler.
* When Forth is coded in Forth, the process of building a new Forth system is called metacompilation. Like most metacompilers, ForthOS's has restrictions on what kinds of Forth code are permissible. (www.forthos.org)
* The LMI Metacompiler is written in Forth-83 and has minimal dependence on the word size or structure of the host or target systems. (www.forth.org, "forth.org metacompiler", www.forthos.org)
* The input to the Metacompiler may be either traditional Forth screens or ordinary text files. Source files may "include" other files, up to 4 levels of nesting. (www.forth.org, "forth.org metacompiler", www.forthos.org)

So my adversary gave references to Forth sites, thinking they prove his point. Apparently I misconstrue his references when I actually study them: looking at the compiler code, spending hours researching the so-called Forth metacompiler, and never finding one. Instead I find that Forth freaks consider Forth itself a metacompiler.

Can it be that my adversary gets his idea of a metacompiler from Forth? He did tell me, "Forget about Meta, cast it into Hell... for God's sake, learn Forth."

My adversary keeps saying I wish to narrow this topic to Schorre's work, but he has never shown an actual case of this.

The history of metacompilers is closely tied to the history of the SIGPLAN Working Group 1 on Syntax Driven Compilers. The group was started in the Los Angeles area, primarily through the efforts of Howard Metcalfe.[1][2]

Meta is used to mean about or describing, as in metadata (data about data). A language that is used to describe other languages is a metalanguage. English can be considered a metalanguage. BNF, Backus–Naur Form, is a formal metalanguage originally used to define ALGOL 60. BNF is a weak metalanguage, for it describes only the syntax and says nothing about the semantics or meaning. English, on the other hand, is powerful, yet its informality prohibits its translation into computer programs.
A meta-compiler is a program that reads a metalanguage program as input and translates that program into a set of instructions. If the input program is a complete description of a formal programming language, the translation is a compiler for the language.
A metacompiler is, itself, usually defined in a metalanguage. This metalanguage can then be compiled by a metacompiler, usually itself, or hand-compiled into an existing computer language. The step of first writing a metacompiler by hand-compiling it into an existing language is called bootstrapping. That is, the programmer pretends to be the metacompiler, parsing its rules and writing the code the metacompiler would generate, if it existed.

As anyone can see above, there is not one thing specific to Schorre metacompilers in the definition I proposed. Parsing expression grammars are used in many of the known metacompilers. I noted this about the Schorre languages, and there are several web pages also saying the same thing. It does limit metacompilers to a class of parsers, but not to Schorre specifically, as my adversary is saying. I have no objection to changing that specific limitation. My intention is not to limit the parser type.

But the real gist of that change is relegating the self-compiling to a much lower level, as the 40+ year old definition of a metacompiler does not even include compiling itself. Maybe I wasn't clear in my proposed change. I wasn't proposing to change the other information on the page, just the definition part pertaining to the meaning of metacompiler. The other uses I did not intend to change; the explanation that metacompilers can be used for other functions would be unchanged. I am not really changing the actual meaning all that much. It's more a rearrangement making the discriminating attribute: taking a metalanguage as input that specifies or directs the transformation of the input language into another form, which may be executable code.

--Steamerandy (talk) 05:52, 19 October 2014 (UTC)[reply]

Revised proposed change to metacompiler topic

The misguided idea that "The feature that sets a metacompiler apart from a standard compiler-compiler is that a metacompiler is written in its own language and translates itself" is not a defining feature of a metacompiler. Many, many compilers are written in the language they compile; C++, C, ALGOL, and many other compilers have been written in their compiled language. Bootstrapping a self-compiled compiler in any language is no different from the bootstrapping process claimed to be so special in the original definition. The real discerning feature is the metalanguage input. In a metacompiler the metalanguage directs the compilation process, defining the syntax of the language and its transformation to output code. The metalanguage is a programming language that actually directs the compiling process. This clamoring about self-compilation detracts from the essence of the elegant solution metacompilers offer to compiler writers. I hope the following achieves that goal:

Metacompilers are a special class of syntax-directed compiler-compiler, compiling a metalanguage description of the target language that is a combination of syntax parsing rules and semantic production rules or elements.[3][1][2][4]
Meta is used to mean about or describing, as in metadata (data about data). A language that is used to describe other languages is a metalanguage. English can be considered a metalanguage. BNF, Backus–Naur Form, is a formal metalanguage originally used to define ALGOL 60. BNF is a weak metalanguage, for it describes only the syntax and says nothing about the semantics or meaning. English, on the other hand, is powerful, yet its informality prohibits its translation into computer programs.
A meta-compiler is a program that reads a metalanguage program as input and translates that program into a set of instructions. If the input program is a complete description of a formal programming language, the translation is a compiler for the language. Thus one could say that metacompiler is a contraction of metalanguage compiler.[4]
A metacompiler is, itself, usually defined in a metalanguage. This metalanguage can then be compiled by a metacompiler, usually itself, or hand-compiled into an existing computer language. The step of first writing a metacompiler by hand-compiling it into an existing language is called bootstrapping. That is, the programmer pretends to be the metacompiler, parsing its rules and writing the code the metacompiler would generate, if it existed. Bootstrapping may continue incrementally, adding features to create successively more powerful compilers.
Metacompilers are not only useful for generating parsers and code generators, they are also useful for generating a wide range of other software engineering and analysis tools.[5]
Besides being useful for parsing domain-specific languages, a metacompiler is itself a prime example of a domain-specific language, designed for the domain of compiler writing.
A full development package would include a linker and a run-time support library. Usually a machine oriented language is required for writing the support library. Today C or C++ might be used as a machine oriented language. A library consisting of support functions required for the translation process usually rounds out the full metacompiler package. This might for instance include memory management, input/output, symbol table, and string processing functions.

Note: this is only to replace the first part of the Metacompiler topic. The historic content and below is not to be changed. Note: I need to find a more suitable string-processing description. For the most part string processing has to do with comparing constant static strings against the input. Symbols are commonly a type of string and may have string attributes.
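To make the hand-compiling (bootstrap) step described in the proposed text above concrete, here is a rough sketch of how a programmer acting as the metacompiler might translate one syntax rule into a host language by hand. It is my own illustration in C++, not CWIC, META II, or SLIC output; the parse-stack helpers are hypothetical stand-ins for real runtime support.

// A rough sketch of hand compiling the rule  EXP = TERM $('+' TERM :ADD!2);
// The parse-stack helpers below are hypothetical stand-ins for the real
// runtime support a metacompiler would supply; only the shape matters.
#include <iostream>
#include <string>
#include <vector>

static const char *in;                       // input cursor
static std::vector<std::string> parseStack;  // holds built (sub)trees as text

bool match(char c) {                         // match and consume a literal
    if (*in != c) return false;
    ++in;
    return true;
}

bool term() {                                // TERM: here just a single digit
    if (*in < '0' || *in > '9') return false;
    parseStack.push_back(std::string(1, *in++));
    return true;
}

// Hand translation of  EXP = TERM $('+' TERM :ADD!2);
bool exp() {
    if (!term()) return false;               // EXP starts with a TERM
    while (match('+')) {                     // $('+' ...): zero or more times
        if (!term()) return false;
        std::string y = parseStack.back(); parseStack.pop_back();
        std::string x = parseStack.back(); parseStack.pop_back();
        parseStack.push_back("ADD[" + x + "," + y + "]");  // :ADD !2
    }
    return true;
}

int main() {
    in = "1+2+3";
    if (exp()) std::cout << parseStack.back() << "\n";   // prints ADD[ADD[1,2],3]
    return 0;
}

Once enough rules have been hand-translated this way, the metacompiler can compile its own metalanguage description and the hand-written version can be discarded.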

I think that the only reason Damon Simms might not want this definition is that it excludes Forth from the club, or any compiler-compiler not taking a metalanguage as input.

Continued counter-response to the adversarial "Response to RFC claims"

In my original proposed change I was not clear on what was to be replaced. There was no intention of replacing the whole of what was there, just the part that defined a metacompiler.

My adversary will say that I am changing my position again. Sorry, but the "dumbshit"- and "ignoramuses"-calling adversary blows so much smoke trying to cover up his real objection: that my definition would exclude Forth from the metacompiler club. Note the Wikipedia take on Forth self-compilation being called meta-compilation.

My adversary limits metacompilers to production grammars by his description ignorantly linking to formal grammar, and then claims I am the one who doesn't understand the grammar terms. A minor point, but it shows Damon Simms's lack of understanding of the markup language used, or his lack of understanding of computer science compiler terminology. He says he knows the lingo and I do not.

I have pointed out to my adversary that his own references talk about metacompilers implemented in other languages, one in assembly and another implemented in PL/I. These were not written in themselves; they do not compile themselves. The one written in assembly is not a metacompiler but an implementation of Forth, which I discovered by actually reading the reference my adversary gave.

Forth is not a metacompiler!!! And no amount of name-calling ("dumbshit", "ignoramuses") from my adversary is going to change that.

To be honest I cannot say whether my adversary is intentionally biased toward the Forth metacompiler claim or just did not investigate his sources.

My adversary claims that my experience as head of the language development group at Kontron Electronics, where we actually wrote commercial compilers, is immaterial. I used my experience as an example of how ludicrous his position is that "bootstrapping a compiler from nothing is mind-bending".

From my own personal experience writing many compilers, starting from scratch is business as usual. Writing any program from scratch where there is no prior art is a more daunting task. So what!!

All of the metacompilers I have worked with could compile themselves. But so could all of the PASCAL compilers I have written. What is the difference between a PASCAL compiler written in PASCAL compiling itself and META II compiling itself? Is there really any difference in bootstrapping a PASCAL compiler: writing a P-code interpreter from scratch; hand-compiling a PASCAL compiler into P-code; compiling the hand-coded PASCAL compiler with itself; modifying the PASCAL compiler to generate assembly language and creating a run-time support library; bootstrapping the original P-code PASCAL compiler into a machine-executable compiler; and further modifying the compiler, adding linkage declarations to make it modular? The PASCAL compiler has all the same attributes that my adversary claims make a metacompiler different from other compiler-compilers, except that it does not take a metalanguage as input. I forgot a step: a P-code assembler had to be written to assemble P-code assembly source into a P-code binary load module. My adversary says I do not understand bootstrapping. It seems to me that my adversary is the one who does not understand bootstrapping and is just mouthing material he does not understand.

Writing a PASCAL compiler from scratch is really no different from writing a metacompiler from scratch. In fact META II was written on a pseudo-machine (interpreter) having only four instructions. How hard can it be to write code for a four-instruction machine?

Read the response made to "Can we remove 'and usually mind-bending'". Note: it was from an IP, but my adversary has said it was his response.

From my own experience and from interviewing many others in computer and software engineering, the process of bootstrapping is the part that is mind-bending. Unlike other iterative software development which just builds successive improvements and extensions on an existing base of software, in bootstrapping you start with nothing and have the software generate itself.

Damon Simms said:

About a week ago Steamerandy posted some responses on this talk page ridiculing a technical answer I had posted to someone else's question. I responded, based on standard computer science concepts which can be found here on Wikipedia (bootstrapping was the topic). Not only did he stand by his ridicule of my responses, he posted his technical biography as proof he knew what he was talking about, and to claim I was wrong.

What does the bootstrapping topic here on Wikipedia have to say:

Bootstrapping can also refer to the development of successively more complex, faster programming environments.

There is more there about developing a compiler in one language and then rewriting it in its own language, etc. Just follow the bootstrapping link and see what it has to say.

NOTE: Again my adversary, who does not understand computer science terms, keeps saying it is I who does not understand them. Iterative development is bootstrapping. Who's the "dumbshit" "ignoramus" misunderstanding computer science terminology? The Bootstrapping (compilers) topic is controversial; it seems to have drawn more objections than parsing expression grammar.

The Student Who Became a Professor

Years ago I had a student come to me for help with a personal project: a simple command-line calculator. It had to handle numeric algebraic expression hierarchy. This was on a DEC-10, so we sat down and coded it up in assembly. It took about a half hour, explaining exactly every step: first inputting a digit string and converting it to binary, then an output routine with a bit of glue code for testing, inputting a number and outputting the result. The DEC-10 was a unique machine having 16 36-bit registers, any of which could be used as a stack pointer, even as a call stack. So it was a simple matter to write a recursive parser. It basically input numbers and recognized operators, calling functions implementing a recursive descent parser. Anyway, he was amazed. That amazement led him to getting his degree. Years later I visited my old college and he was teaching there, so we talked a bit, and the old calculator project came up. He said it took years to fully understand it to the point he could do it himself. He was using it in teaching parsing to his students. He no longer thought it so amazing.

A fellow programmer I worked with wrote a binary tree balancer that is almost incomprehensible. This was also on a DEC-10. It used two separate call stacks. It had two co-functions that called each other. Each function called the other function, or itself, on one of the two stacks depending on where it was in the code. It worked great. I used it. I figured it out but have forgotten the specifics. Now that was mind-bending. Following the code, one routine would return on a different stack than the one it was called on. Now that was really mind-bending. But hey, it worked.
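For readers who have never written one, the expression-hierarchy calculator described above boils down to a handful of mutually recursive routines. This is a minimal C++ sketch of the same structure; it is my own illustration, not the original DEC-10 assembly, and it ignores error handling and whitespace.

// Minimal recursive descent expression calculator, sketching the structure
// of the DEC-10 assembly project described above.
#include <cctype>
#include <cstdio>

static const char *p;                         // current input position

long expr();                                  // forward declaration

long number() {                               // digit string -> binary value
    long v = 0;
    while (isdigit((unsigned char)*p)) v = v * 10 + (*p++ - '0');
    return v;
}

long factor() {                               // number or parenthesized expression
    if (*p == '(') { ++p; long v = expr(); if (*p == ')') ++p; return v; }
    return number();
}

long term() {                                 // * and / bind tighter than + and -
    long v = factor();
    while (*p == '*' || *p == '/') {
        char op = *p++;
        long r = factor();
        v = (op == '*') ? v * r : v / r;
    }
    return v;
}

long expr() {                                 // lowest precedence level
    long v = term();
    while (*p == '+' || *p == '-') {
        char op = *p++;
        long r = term();
        v = (op == '+') ? v + r : v - r;
    }
    return v;
}

int main() {
    char line[256];
    while (fgets(line, sizeof line, stdin)) { // read a line, print the result
        p = line;
        printf("%ld\n", expr());
    }
    return 0;
}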

The Awe is Gone

Why did I talk about the student above? I think it explains why my adversary finds writing a compiler from scratch so mind-bending: he is like that student seeing it for the first time, which I think should not be the position from which the metacompiler definition is written. The amazing thing is the ease with which a compiler or some other language processing application can be written using a metacompiler. And by the way, the calculator episode happened over 40 years ago. It didn't take me a lifetime to learn how to write compilers. I have been writing compilers for over 44 years.

What Is Unique About Metacompilers That Distinguishes Them From The Rest

I believe that taking metalanguage source as input is the one single difference that separates a metacompiler from the rest. A metacompiler takes a metalanguage that not only defines the syntax of a language but also defines the code produced from the language constructs. That is what separates it from the rest of the so-called compiler-compilers. It is the metalanguage programming that makes a metacompiler different from, say, a C++ compiler written in C++. The metalanguage, if designed right, is a readable description of the parsing and code generation processes, expressed in a language designed for readable language compilation. That is amazing. It's amazing that you can control the tree generation: it's a simple matter to generate a right- or left-handed tree, and the difference is in how the rule is written, using recursion or a loop construct. You do not have to change a complicated parser written in C or some other language unsuited to the task. But that may be too specific to Schorre's work. The analytic grammar used in Schorre metacompilers is very readable.
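To show the right- versus left-handed tree point concretely, here is a sketch in C++ rather than in the metalanguage itself. The recursive rule shown in the comment is only my paraphrase of the style; Node, term(), and match() are hypothetical helpers, not part of any of the compilers discussed.

// Sketch: how writing a rule with a loop versus recursion changes the
// shape of the generated tree.
#include <iostream>
#include <string>

static const char *in;                  // input cursor

struct Node {
    std::string op;                     // "ADD" or a leaf value
    Node *left;
    Node *right;
};

bool match(char c) { if (*in != c) return false; ++in; return true; }

Node *term() {                          // a TERM: here just one digit
    return new Node{std::string(1, *in++), nullptr, nullptr};
}

// Loop form, like       EXP = TERM $('+' TERM :ADD!2);
// builds a LEFT-leaning tree: ADD[ADD[a,b],c]
Node *expLoop() {
    Node *t = term();
    while (match('+')) t = new Node{"ADD", t, term()};
    return t;
}

// Recursive form, along the lines of  EXP = TERM ('+' EXP :ADD!2 / .EMPTY);
// builds a RIGHT-leaning tree: ADD[a,ADD[b,c]]
Node *expRec() {
    Node *t = term();
    if (match('+')) t = new Node{"ADD", t, expRec()};
    return t;
}

void print(const Node *n) {
    if (!n->left) { std::cout << n->op; return; }
    std::cout << n->op << "["; print(n->left); std::cout << ",";
    print(n->right); std::cout << "]";
}

int main() {
    in = "1+2+3"; print(expLoop()); std::cout << "\n";   // ADD[ADD[1,2],3]
    in = "1+2+3"; print(expRec());  std::cout << "\n";   // ADD[1,ADD[2,3]]
    return 0;
}

The grammar change is one rule; no hand-written parser code has to be touched to get one tree shape or the other.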

This confusion started with yacc (Yet Another Compiler Compiler) calling itself a compiler-compiler when it is really a parser generator, as most so-called compiler-compilers are. Maybe yacc and the rest of the parser generators should be kicked out of the compiler-compiler club. It would make compiler-compiler mean what it sounds like it should mean: a compiler producing an executable compiler from a source language description. If you go back through early compiler development you find many compiler-compilers that actually produced machine code output, years before yacc.

My adversary, never having written a commercial compiler himself, keeps trying to dump on me. I am sure there are many game developers around who have used one of my compilers. Kontron development systems were used to write many Atari games, using the 6502 assembler or the PASCAL cross compiler. I was head of the language development group at Kontron/FutureData. Despite what my adversary says, my experience writing real-world compilers is my credential.

I am not all that concerned with the grammar link on the metacompiler page, but what it does show is my adversary's lack of understanding of computer science terminology. He just can't seem to grasp what I am talking about. I discovered the problem when doing my proposed change: I checked every link I used, and my proposed change avoids that problem. Anyway, here is the problem with my adversary's use of grammar.

A metacompiler is defined by a set of grammar ... The actual grammar link is to: "Formal grammar"

Follow the grammar link above to Formal grammar, where it is described as a production grammar. My adversary is so myopic he fails to understand my objection to that link. There are many compilers that do not use the generative grammar that a formal grammar is defined as. The Schorre line of metacompilers uses analytic grammars. I would say that they are PEGs, which they do fit into; analytic grammars works as well. There are top-down analytic rules defining the legal syntax constructs of the language. In the Schorre line of metacompilers there is a main driving syntax rule, like main in C and C++. In Schorre's compilers there was a statement that specified the driving rule: ".SYNTAX PROGRAM" specified PROGRAM to be the driving rule.

I copied the definition here so my adversary might better understand what I was talking about. He misconstrues why I had to resort to that. But just for reference, here it is again.

In theoretical linguistics, a generative grammar refers to a particular approach to the study of syntax. A generative grammar of a language attempts to give a set of rules that will correctly predict which combinations of words will form grammatical sentences. A formal grammar is a set of rules for rewriting strings, along with a "start symbol" from which rewriting starts. Therefore, a grammar is usually thought of as a language generator.

Now my adversary claims

The person filing the RFC, Steamerandy, first started editing this topic 3 weeks ago, and his additions were accepted, even though they appeared to be erroneous. Unfortunately the documentation to challenge his changes is now classified/secret by the US government and therefor unavailable -- seriously, I'm not making this up, CWIC became a classified project of SDC, System Development Corporation, a research company in Santa Monica that did a lot of work for the military. From Steamerandy's own Talk page:
"CWIC was developed at SDC a government think tank. It was classified by the government some time in the early 70s. But I got my manual from Erwin at an ACM meeting before it was classified."

True, except for the fact that there is an ACM publication describing CWIC that can be obtained from the ACM archives: "CWIC"

So he made changes based on a manual none of the rest of us have access to, but hey, additional information on this topic is always appreciated.
Note that the web page linked is titled "CWIC: continuous web image collector", which appears to be "a system that automatically traverses selected Web sites collecting and analyzing images" and presumably has nothing to do with compilers or metacompilers. 46.13.191.93 (talk) 00:18, 31 August 2015 (UTC)[reply]
Don't know how that CWIC link was wrong. "The CWIC/36O system, a compiler for writing and implementing compilers" Steamerandy (talk) 01:39, 1 September 2015 (UTC)[reply]

This is a lie. The "CWIC" reference was given. Maybe he already knew the ACM document contained a metacompiler definition contrary to his.

In its most general form a metacompiler is a program, written for a machine M which will accept specifications for a Programming language Li; and its equivalent in the language of Machine Mi, and produce a compiler which runs on Machine M. Source Programs which are the input to this compiler are written in Language Li. The output of this compiler is object language which runs on machine Mi.
For a reason I posted a few lines above, it appears unlikely that the document linked contains any such definition. 46.13.191.93 (talk) 00:18, 31 August 2015 (UTC)[reply]
The CWIC/360 document does. You might have checked the CWIC reference in the metacompiler article; that one is correct. Seems the ACM document id was wrong. Steamerandy (talk) 01:46, 1 September 2015 (UTC)[reply]

OMG!! There is no mention of a metacompiler compiling itself.


How is it that Damon Simms can go muck up the TREE-META page, admitting he knows little about TREE-META, and change all instances of "unparsing" to "nonparsing", when "unparse" is from the TREEMETA manual? Unparse rules disassemble the parse tree, thus unparsing the parse tree. From the TREEMETA manual, the unparse syntax:

28 Unparse Expressions
28A Syntax
28A1 outexp = subout ('/ outexp / .empty);
28A2 subout = outt (rest / .empty) / rest;
28A3 rest = outt (rest / .empty) / gen (rest / .empty);
28A4 outt = .id '[ arglst '] / '( outexp ') / nsimpl (': (S / L / N / C) / .empty);
28A5 arglst = argmnt (', arglst / .empty) / .empty;
28A6 argmnt = nsimp / '.pound .num;
28A7 nsimpl = 't nsimp / nsimp;
28A8 nsimp = '* .num (': nsimp / .empty);
28A9 genl = (out / comm) (genl / .empty);
28A10 gen = comm / genu / '< / '> ;

"TREEMETA Manual"

It's obvious Damon is blowing smoke. One should doubt anything he says. How is it that Damon Simms can change the TREEMETA topic to disagree with terminology from the TREEMETA manual? Was it out of ignorance or for some agenda he has? Damon Simms vandalizes the TREE-META topic and then threatens me: "if you try changing this page to your limited wrong myopic views, we will consider it to be vandalism."

Actually, check it out; it's here on the talk page. I need to find out how to report someone for vandalism. People do make mistakes. Best for now to figure Damon Simms made that TREE-META change out of ignorance.

Exactly what changes was Damon talking about? For the ones I made to the CWIC example that was already there, I gave a reference to the CWIC ACM paper. There is little if any information about CWIC I have put on the metacompiler page that isn't in that paper. It talks about the generator language being able to manipulate trees before generating code. It has an example of a simple interpreter demo written entirely in CWIC. I changed the CWIC example to use the unparse rules exactly as illustrated in the interpreter example in the ACM document. Well, not quite: I didn't use vectors. True, it is not stated there that the CWIC generator language is LISP 2; others at the ACM meeting recognized it and Erwin acknowledged it was. In the TREEMETA document I found some metacompiler history explaining that at CDC, BOOK4 and BOOK5 were metacompilers written in LISP 2. In fact the TREEMETA document has a wealth of information on early metacompilers.

A CWIC Example

This example is directly from the publicly available ACM paper on CWIC ($15 to non-members, $10 to members, and $5 to student members).[4]

Figure IV[4]

.SYNTAX
PROGRAM = $(ST | DECLARATION) '.END' ;
DECLARATION = ID '=' NUM ';' :EQU!2 DECL[*1];
ST = ID ':=' EXP ';' :STORE!2 COMPILE[*1];
EXP = TERM $('+' TERM :ADD!2);
TERM = FACTOR $('*' FACTOR :MPY!2);
FACTOR = ID / '(' EXP ')' /NUM;
LET: 'A'/ 'B'/ 'C'/ 'D'/ 'E'/ 'F'/ 'G'/ 'H'/ 'I'/ 'J'/ 'K'/ 'L'/ 'M'/
     'N'/ 'O'/ 'P'/ 'Q'/ 'R'/ 'S'/ 'T'/ 'U'/ 'V'/ 'W'/ 'X'/ 'Y'/ 'Z';
DGT: '0'/ '1'/ '2'/ '3'/ '4'/ '5'/ '6'/ '7'/ '8'/ '9';
ALPHNUM: LET / DGT;
ID .. LET $ALPHNUM;
NUM .. DGT $DGT MAKENUMBER[];
.FINISH
.STOP SETUP PROGRAM

Figure V[4]

.GENERATOR
DECL(EQU[X, Y]) => DEF:(X) := Y
COMPILE(STORE[X, Y]) => DEF:(X) := EVAL(Y); PRINT(DEF:(X))
EVAL(IDP(X)) => DEF:(X)
(NUMBER(X)) => X
(#V1[EVAL(X), EVAL(Y)]) => #U1
#V =ADD, MPY
#U = X + Y, X * Y
.FINISH
.STOP SETUP PROGRAM

Note: I didn't talk about the vectored (#V1) pattern matching used above, where EVAL matches the ADD and MPY nodes using the CWIC vector operator #. It is obvious what the vectors do, right? Note: CWIC specifies the driving rule with the ".STOP SETUP PROGRAM" line; SETUP is a MOL-360 function, called first before calling PROGRAM.

One of the greatest things about these metalanguages is their readability and the ease with which they can be extended, and maybe that should be one of the defining factors. The above example only handles addition and multiplication. What would it take to extend it to handle subtraction and division?

We would change the EXP and TERM syntax rules to include the new operators and generate nodes for them:

EXP = TERM $(('+':ADD / '-':SUB) TERM !2);
TERM = FACTOR $(('*':MPY / '/':DIV) FACTOR !2);

How easy was that? Now we would need to change the vectors in the generator:

#V =ADD, SUB, MPY, DIV
#U = X + Y, X - Y, X * Y, X / Y

Almost anybody can understand what I did there. Hell, I have spent gobs more time writing the explanation here than it took to change the program lines. It was said that these are recursive descent parsers. Where is the recursion in the above example? Does anybody see any recursion there? Don't peek.

In FACTOR you have '(' EXP ')'

In the Schorre META grammars recursion is available. Question: is this really a recursive descent parser? If what you are parsing isn't a nested language construct, there is no reason for recursion.

To illustrate the power of a metacompiler I am using an example from developing a COBOL compiler in SLIC:

The COBOL RECORD CONTAINS clause Incident

There was an incident with a COBOL compiler written in SLIC. The DEC-10 COBOL syntax was used so program testing could be done on the DEC-10. When first released, a bug was discovered. As it turned out, the DEC compiler had exactly the same bug: DEC had deviated from the language spec, allowing a FILE CONTAINS clause to use RECORD where the spec required ('RECORDS' / .EMPTY). In the program, the FILE CONTAINS clause was followed by a RECORD CONTAINS clause. The FILE CONTAINS clause took the RECORD of the RECORD CONTAINS clause as being part of the FILE CONTAINS clause, and then, not recognizing CONTAINS, produced an error. It was less than a ten-minute fix. The FILE CONTAINS clause, which had ended up matching

('RECORDS' / 'RECORD' / .EMPTY)

was changed to

('RECORDS' / -RECORD_CONTAINS_CLAUSE 'RECORD' / .EMPTY)

This added a peek-ahead that checked for a RECORD CONTAINS clause. All fixed. There was still the problem of the COBOL programmers not having tested their programs using the DEC COBOL compiler, but that is another story. What kind of a parser is this anyway? That was a quick fix to get the project going again. After checking to make sure that it was safe to look for CONTAINS, it got changed to:

('RECORDS' / ('RECORD' -'CONTAINS' \ .EMPTY))
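For anyone unfamiliar with the peek-ahead trick above: the fix works by testing the upcoming text without consuming it, backing the input up afterward. A rough C++ sketch of that idea follows; it is my own illustration with hypothetical names, not SLIC's actual runtime, and it ignores white-space handling.

// Sketch of a non-consuming look-ahead test of the kind described above.
#include <cstdio>
#include <cstring>

struct Scanner {
    const char *pos;                        // current input position

    bool matchString(const char *s) {       // consume s if it matches here
        std::size_t n = std::strlen(s);
        if (std::strncmp(pos, s, n) != 0) return false;
        pos += n;                           // success: advance the input
        return true;
    }

    bool peekString(const char *s) {        // test s without consuming input
        const char *saved = pos;            // save the input state,
        bool ok = matchString(s);           // try the match,
        pos = saved;                        // then back up either way
        return ok;
    }
};

int main() {
    Scanner in{"RECORD CONTAINS 80 CHARACTERS"};
    // The fixed clause reads roughly as: match RECORD only if a RECORD
    // CONTAINS clause does not follow (here one does, so RECORD is left
    // for the RECORD CONTAINS clause to consume).
    bool takenByFileClause = in.matchString("RECORD ") && !in.peekString("CONTAINS");
    std::printf("%s\n", takenByFileClause ? "RECORD taken by FILE clause"
                                          : "RECORD left for RECORD CONTAINS");
    return 0;
}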

More details on the Run Time Library

I have given examples here on the talk page of code generation from the syntax language based on my personal knowledge. They illustrate how easy it is to hand-compile the analytic syntax rules used in CWIC or TREEMETA into assembly code. The point being: it is not mind-bending. In some of those examples _CmpStr was called. The total code involved in _CmpStr is two pages or more; it's not just a simple character-string compare function. The metacompiler did not compile the run-time code, so really a metacompiler does not compile itself into an executable program. There are many more lines of code in the compiler run-time library than in the metalanguage source of the metacompiler. I downloaded the ACM CWIC paper and learned some things about CWIC I did not know. One is that the SYNTAX language compiler was a separate compiler from the generator compiler. SLIC was a single compiler handling all of its five language forms; the cc compiler is the same. But as I have explained, I no longer consider them to be separate languages. They are no more separate languages than a class declaration is a separate language from C++. An analogy is that the syntax is the input description, like a record definition is to COBOL: a lot more complicated, but still an input definition.

This is from a metacompiler I am working on, basically an updated implementation of SLIC. For the time being I am calling it cc. I am using naked C++ void functions as assembly function containers. The point here is the amount of code not compiled by the compiler-compiler. The parsing and code generation is compiled, but like most languages there is a large amount of support code that gets combined to make an executable program. This illustrates to some extent the amount of overhead code that is part of a real metacompiler, and, in this case, some stack manipulation that would be complicated for a novice. Damon will probably say it doesn't work because he likely won't understand it. Admittedly, exchanging a function return address and moving relative to the stack pointer is not commonly done. A majority of my programming has been in assembly. But the minor stack rearranging going on here is trivial compared to the backtracking functions. Anyway, this is to illustrate the amount of code involved in supporting a simple string match in the syntax language.

void __declspec(naked) function_name(){_asm{ <assembly language coded function> }}

__declspec(naked) removes all preamble and exit code from the function; there is no code in the function other than that coded in the function source. EAX in most cases contains the current input stream character. The following code illustrates the run-time support code needed for a simple string match in the syntax specification. (This is some mind-bending assembly code.) There is a lot of stack manipulation going on, yet it is fairly simple compared to the re-entrant backtracking functions. EBP is used as a backtrack pointer by the syntax rules, basically a nested longjmp into the backtracking support code.

The point is not the specific code; it's the amount and complexity of the code. The metacompiler-generated code is a very small part of the compiler code. The code here is specifically involved in matching a string against the input.

The definition of CHARCLASS from CCclass.h

#define CHARCLASS  byte ptr [eax+classmap]

classmap is a table indexed by a character code. Bits at the indexed location indicate a character's class membership. Class rules assign the bits used for a class and generate the table entries.
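A minimal C++ sketch of the kind of class-membership table being described; the bit assignments, class names, and initialization here are hypothetical illustrations, not the actual table the cc class rules generate.

// Sketch of a character class bit map of the kind described above.
#include <cstdint>
#include <cstdio>

enum : uint8_t {                       // bits a class rule might assign
    letterclass = 0x01,
    digitclass  = 0x02,
    skipclass   = 0x04                 // white space, line breaks, etc.
};

static uint8_t classmap[256];          // indexed by the character code

void initClassmap() {                  // what class rules would generate
    for (int c = 'A'; c <= 'Z'; ++c) classmap[c] |= letterclass;
    for (int c = 'a'; c <= 'z'; ++c) classmap[c] |= letterclass;
    for (int c = '0'; c <= '9'; ++c) classmap[c] |= digitclass;
    classmap[' ']  |= skipclass;
    classmap['\t'] |= skipclass;
    classmap['\n'] |= skipclass;
}

// The assembly test  "test CHARCLASS,skipclass"  corresponds to:
bool isSkip(uint8_t ch) { return (classmap[ch] & skipclass) != 0; }

int main() {
    initClassmap();
    std::printf("' ' skip: %d  'A' skip: %d\n", isSkip(' '), isSkip('A'));
    return 0;
}

The actual support code from cc follows.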

#include "common.h"
#include "CCclass.h"
#include "CCdata.h"

extern void __savePntrs();
extern void __rstrPntrs();
extern void _Advance();
extern BYTE TokenFlg;

/******************************************************************************
*
*  Match a "<string>"
*  push	
*  ecx -> "<string>"
*
*/
void __declspec(naked) _CmpStr(){_asm{  //_CmpStr = "..."
	call	__savePntrs             // returns ZF and ecx = char* "..."
	je	l2			// ZF = NOT skipping skipclass
	cmp	byte ptr[ecx],0		// if (end of string)
	jne	l4			//   then it's a match
	jmp	l2			// in token match not skipping skipclass

l1:	test	CHARCLASS,skipclass	// source stream a skipclass char?
	je	l5			// -skipclass failure if not null string 
	call	_Advance			// advance over skip class chars
	jne	l6			// unable to advance is failure
l2:	cmp	al,byte ptr [ecx]	// source char match string char
	jne	l1			// if not matched keep looking
//	   ******* matched first character rest of string must now match
l3:	cmp	byte ptr[ecx],0		// if (end of string)
	je	l7			//   then it's a match
	inc	ecx			// inc string pointer
	call	_Advance		// advance input stream next char
	jne	l5			// unable to advance is not success
l4:	cmp	al,byte ptr [ecx]	// source char to string char
	je	l3			// jump if match
l5:	cmp	byte ptr[ecx],0		// if (null string)
	je	l7			//   then it's a match
l6:	cmp	esp,0			//  force NE (failure) status
	jmp	__rstrPntrs		//  restore input state, return NE

l7:	test	TokenFlg,Token_Going_On	// In a token rule?
	je	l8			// If not in a token rule, leave flags alone
	or	TokenFlg,TokenWhiteSkip	// Stop skipclass skipping
l8:	pop	ecx			//  restore user ecx
	cmp	al,al			// return ==
	ret				// return match, EQ condition
}}

There are four string test types. Similar to _CmpStr above for "<string>", the other string compares are +"<string>", -"<string>" and ?"<string>". All are identical to the CWIC functions except that SLIC can match skip_class characters: where CWIC first skips skip_class characters, SLIC skips skip_class characters while searching for the first character. A skip_class character may be explicitly matched by SLIC and cc. This could, for example, be used to match a line break before a symbol; assembly labels are often anchored to the front of a line.
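
For anyone who does not read x86 assembly, here is roughly what _CmpStr does, written as portable C++. This is only my reading of the logic above; the helper names (save_mark, advance and so on) stand in for the run-time routines and are not the actual cc source.

struct Mark { /* buffer, data pointer, count, token pointer, flags */ };

extern Mark save_mark();                    // stands in for __savePntrs
extern void restore_mark(const Mark&);      // stands in for __rstrPntrs
extern bool advance();                      // stands in for _Advance; false at end of input
extern unsigned char current_char();        // the character _Advance left in al
extern bool is_skipclass(unsigned char);    // classmap test
extern bool skipping_allowed();             // TokenWhiteSkip not yet set
extern bool in_token_rule();                // Token_Going_On flag
extern void stop_skipping();                // sets TokenWhiteSkip

bool cmp_str(const char* s) {               // "<string>" test, as I read the assembly
    if (*s == 0) return true;               // null string: trivially a match
    Mark m = save_mark();
    // Hunt for the first character, skipping skip_class characters when allowed.
    while (current_char() != (unsigned char)*s) {
        if (!skipping_allowed() || !is_skipclass(current_char()) || !advance()) {
            restore_mark(m);
            return false;
        }
    }
    // First character matched; the rest must follow consecutively.
    for (const char* q = s + 1; *q != 0; ++q) {
        if (!advance() || current_char() != (unsigned char)*q) {
            restore_mark(m);
            return false;
        }
    }
    advance();                              // leave the input just past the matched string
    if (in_token_rule())
        stop_skipping();                    // a partial token match stops skip_class skipping
    return true;
}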

/****************************** _Toknadv *******************************\
*									*
*		       THE SIMPLEST FUNCTION HERE			*
*									*
*	Puts a matched character into the token buffer			*
*	then advances the input stream by jumping to _Advance		*
*									*
\************************************************************************/

__declspec(naked) void _Toknadv() {_asm {
	xchg  ecx,dword ptr tokenptr       // get tokenptr, save caller's ecx
	mov   byte ptr [ecx],al            // put character in TokenBuf
	inc   ecx                          // advance tokenptr
	mov   byte ptr [ecx],0             // keep buffer terminated (clean display when debugging)
	xchg  ecx,dword ptr tokenptr       // store tokenptr back, restore ecx
	or    TokenFlg,TokenWhiteSkip      // disable skipclass skipping
	jmp   _Advance // Advance input pointer, At end of input return null

//	_Advance is parse stream input function
}}


/************** Set up functions for cc matching function **************\
*									*
*  cc matching functions are called by token making and grammar rules to *
*  match strings and characters in the input stream.			*
*									*
*     !! This is assembly code in a C++ wrapper !!!!			*
*									*
*  __savePntrs:								*
*  									*
*  __savePntrs is called on entry by a string match function to save the 	*
*  input stream state required to Back Track a failed match. Matching 	*
*  functions are not re-entrant. States do not need to be stacked.	*
*  Objects are not put on the parse stack or node stack by them. Back	*
*  Tracking is only done when they do not succeed. Only the input	*
*  stream state is saved statically.					*
*  									*
*  On success a match function simply returns success. On failure it 	*
*  must restore the input stream state. To restore the input stream 	*
*  it simply jumps to __rstrPntrs. 					*
*  									*
*	jmp	__rstrPntrs						*
*									*
*  __rstrPntrs:								*
*  									*
*  __rstrPntrs is always jumped to on failure by a matching		*
*  function and only restores the input stream state			*
*									*
*  Token skipclass skipping is also affected by matching functions. When	*
*  called from a token rule a match will set the partial match flag	*
*  that prevents further skipclass skipping. The flag may only be set 	*
*  when a token rule is active.						*
*									*
*  The input stream is a series of files. An FCB (file control block)	* 
*  is used to track files. FCBs are linked in order as given on the 	*
*  command line. An FCB basically manages file buffers. File buffers are *
*  also linked in the order filled. Input file buffers are read from 	*
*  files as the parse rules progress through the input. The FCB status 	*
*  keeps the file state. Opened and end of file reached. Error flags 	*
*  generate a compile error. The stream position is a set of pointers. 	*
*  The data pointer points at the current input position in a file 	*
*  buffer. The BCB pointer points at the current file buffer. The data 	*
*  count holds the number of bytes processed. It is needed so as not to 	*
*  re-output characters to the print stream. 				*
*									*
*  The token and input stream saved state are as follows:		*
*									*
\***********************************************************************/


//__CC_FCBDCB mark0;		// saved input stream state
 __CC_DBCH*	markbufr;	// buffer header pointer
 char*		markdataptr;// data pointer
 short		markdatacnt;// bytes remaining in buffer;
 char*		marktokenptr;

BYTE   token_stats;
int stream_inputcnt;
char* strng_parm;

__declspec(naked) void __savePntrs(){_asm{  // crazy stack manipulation!!

/***********************************************************************\
*   __savePntrs saves input stream state for string match function. 	*
*									*
*	push	offset <string address>					*
*	call	_match_function_					*
*    <return from _matching function_>					*
*									*
*__declspec(naked) void _match_function_() {_asm{			*
*	call	__savePntrs						*
*    <return from __savePntrs>						*
*									*
* stack on entry:							*
* [esp+8]	_matching function_'s <string address>			*
* [esp+4]	<return from _matching function_>			*
* [esp+0]	<return from __savePntrs> to _matching function_ 	*
*									*
*     rearrange stack: Stack on entry as above				*
*									*
*    preserving the calling rule's ecx					*
\***********************************************************************/

	xchg	ecx,[esp+4]     // swap ecx for _matching function_'s return
	xchg	ecx,[esp+8]     // swap return for char* str

/***********************************************************************\
* [esp+8]	<return from _matching function_>			*
* [esp+4]	caller's saved ecx					*
* [esp+0]	<return from __savePntrs> to _matching function_ 	*
*   ecx		_matching function_'s <string address>			*
*									*
* Stack now as above with ecx *string parameter from [esp+8] on entry	*
*									*
\***********************************************************************/

	mov	strng_parm,ecx		// string parameter saved
	mov	ecx,InPutFCB		// point ecx at FCB

	mov	eax,FCB_bufr(ecx)	// (+2) save buffer pointer
	mov	markbufr,eax

	mov	ax,FCB_datacnt(ecx)	// save data count
	mov	markdatacnt,ax

	mov	eax,FCB_dataptr(ecx)	//  
	mov	markdataptr,eax	// 

	mov	eax,__inputcnt
	mov	stream_inputcnt,eax

//	mov	al,FCB_status(ecx)	//  
//	mov	markstatus,al	// 

	mov	al,TokenFlg		// save current flags state
	mov	token_stats,al		// so can be restored on fail

//  Got TokenFlg skipclass state. Matching can not skip skipclass once any 
//  characters have been matched. That state is passed back to the matching 
//  function in the zero status flag by the following test instruction.

	test	al,TokenWhiteSkip	// z flag = state of WhiteSkip

	mov	eax,tokenptr		// Save Token pointer
	mov	marktokenptr,eax	// after TokenWhiteSkip

	mov	eax,FCB_dataptr(ecx)		// 
	movzx	eax,byte ptr [eax]	// get current BYTE

	mov	ecx,strng_parm		// point ecx at string parameter
	ret

/***********************************************************************\
*									*
*    On return to caller:						*
*									*
* [esp+4]      original <return from _matching function_>		*
* [esp+0]      caller's saved ecx   <must be popped by function>		*
*   ecx        points at string to match <string address>		*
*   NOTE.  __rstrPntrs  will restore caller's ecx. Do not pop ecx	*
*									*
*   ecx only needs to be popped on success before a return.		*
*	pop	ecx		// needs restoring			*
*	ret			// before a successful return		*
*									*
* NO POP ECX before:							*
*	jmp	__rstrPntrs	// ecx will be restored by __rstrPntrs 	*
*									*
*    Input stream state saved.						*
\***********************************************************************/
}}


// **************************************************************
// *********  NEVER EVER CALL THIS FUNCTION EVER NEVER  *********
//
__declspec(naked) void __rstrPntrs(){_asm{ // match string reset input stream.

//   !*!*!*!*!*!*!*!*!  COMMON FAIL EXIT CODE  !*!*!*!*!*!*!*!*!
//   !*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!
//   !*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!
//   !*!*!*!*!*!*!*!*!*!*!*!*!  NOTE!  !*!*!*!*!*!*!*!*!*!*!*!*!
//   !*!*!*!*!*!*!*!*!*!*!*!*!  NOTE!  !*!*!*!*!*!*!*!*!*!*!*!*!
//   !*!*!*!  __rstrPntrs is always jumped to with saved !*!*!*!
//   !*!*!*!  ecx left on the stack to restore on exit.  !*!*!*!
//   !*!*!*!       THIS FUNCTION IS ALWAYS JMPed to      !*!*!*!
//   !*!*!*!      NEVER EVER !! CALLED !!  NEVER EVER    !*!*!*!
//   !*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!
//   !*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!*!

	mov	eax,stream_inputcnt
	mov	__inputcnt,eax

	mov	eax,marktokenptr		// Restore Token pointer
	mov	tokenptr,eax

	mov   	al,token_stats
	mov   	TokenFlg,al                       // State restore includes token flags

	mov	ecx,InPutFCB		// point ecx at FCB

	mov	eax,markbufr
	mov	FCB_bufr(ecx),eax	// (+2) restore buffer pointer

	mov	ax,markdatacnt
	mov	FCB_datacnt(ecx),ax	// restore data count

	mov	eax,markdataptr	// 
	mov	FCB_dataptr(ecx),eax	//  

	mov	eax,FCB_dataptr(ecx)	// reload the current input character
	movzx	eax,byte ptr [eax]	//   into al for the caller (as __savePntrs does)
	pop	ecx
	ret
}}
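
Put another way, the state that __savePntrs squirrels away and __rstrPntrs puts back is just a small record like the following. The field names mirror the statics above; the struct itself is only a sketch, not the cc source.

struct StreamMark {
    void*         bufr;        // current file buffer            (markbufr)
    char*         dataptr;     // position within that buffer    (markdataptr)
    short         datacnt;     // bytes remaining in the buffer  (markdatacnt)
    int           inputcnt;    // characters consumed so far     (stream_inputcnt)
    char*         tokenptr;    // token-buffer fill pointer      (marktokenptr)
    unsigned char tokenflags;  // TokenFlg, incl. TokenWhiteSkip (token_stats)
};
// Saved on entry to every matching function; restored only on failure. Success
// keeps the new position, which is why nothing else needs to be undone here.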

The following is the integer token rule. Note that there were token rules in the CWIC example from the CWIC ACM paper. TOKEN rules are defined by the .. operator. CWIC character class rules define the character classes used in the integer rule.

bin:         '0'|'1';

oct:         bin | '2' | '3' | '4' | '5' | '6' | '7';

dgt:         oct | '8' | '9';

hex:         dgt | 'A' | 'B' | 'C' | 'D' | 'E' | 'F'
                 | 'a' | 'b' | 'c' | 'd' | 'e' | 'f';
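
For what it is worth, here is a rough sketch of what those four class rules amount to in the classmap table. The bit values and the function are my own illustration, not generated code. The point is that class inclusion (oct includes bin, dgt includes oct, hex includes dgt) just means the member characters also carry the including class's bit, so a class test is still a single test CHARCLASS,<class>.

#include <cstring>
typedef unsigned char byte;

enum : byte { bin = 0x01, oct = 0x02, dgt = 0x04, hex = 0x08 };  // illustrative bit values

byte classmap[256];

void build_number_classes() {
    std::memset(classmap, 0, sizeof classmap);
    for (int c = '0'; c <= '1'; ++c) classmap[c] |= bin | oct | dgt | hex;
    for (int c = '2'; c <= '7'; ++c) classmap[c] |= oct | dgt | hex;
    for (int c = '8'; c <= '9'; ++c) classmap[c] |= dgt | hex;
    for (int c = 'A'; c <= 'F'; ++c) classmap[c] |= hex;
    for (int c = 'a'; c <= 'f'; ++c) classmap[c] |= hex;
}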

And the hand-compiled integer token rule (note the integer rule is expressed in the comments). This deviates from CWIC and SLIC, which only allowed a single conversion call in a token rule; cc allows a conversion call to terminate an outermost alternative.

void integer();      // ..  ("0b"|"0B")           bin $bin  MAKEBIN[]  // binary number
                     //    |("0o"|"0O")           oct $oct  MAKEOCT[]  // octal number
                     //    |("0x"|"0X"|"0h"|"0H") hex $hex  MAKEHEX[]  // hex number
                     //    |                      dgt $dgt  MAKEINT[]; // decimal number

char _str_0B[] = "0B";
char _str_0b[] = "0b";
char _str_0O[] = "0O";
char _str_0o[] = "0o";
char _str_0H[] = "0H";
char _str_0h[] = "0h";
char _str_0X[] = "0X";
char _str_0x[] = "0x";


__declspec(naked) void integer() {  _asm {
	call	_TokenEntry

//   ("0b"|"0B")           bin bin* MAKEBIN()

	push	offset _str_0B		// "0B"
	call	_CmpStr
	je	l1
	push	offset _str_0b		// "0b"
	call	_CmpStr
	jne	l5

l1:	test	CHARCLASS,bin		// ('0' | '1')
	jne	l3
	cmp	esp,0			// no bin digit after the prefix:
	ret				//   force NE (failure) and return
l2:	test	CHARCLASS,bin		// ('0' | '1')
	je	l4
l3:	call	_Toknadv
	jmp	l2
l4:	call	MAKEBIN
	ret

l5:	push	offset _str_0O             // "0O"
	call	_CmpStr
	je	OCT
	push	offset _str_0o             // "0o"
	call	_CmpStr
	je	OCT

	push	offset _str_0H             // "0H"
	call	_CmpStr
	je	HEX
	push	offset _str_0h             // "0h"
	call	_CmpStr
	je	HEX
	push	offset _str_0X             // "0X"
	call	_CmpStr
	je	HEX
	push	offset _str_0x             // "0x"
	call	_CmpStr
	je	HEX

l6:	test	CHARCLASS,dgt             // Looking for a digit.
	jne	INTG                      // If it is a dgt goto INTG
	test	CHARCLASS,skipclass       // Not a dgt test if skipclass 
	je	l7                        // jump if not a skipclass char
	call	_Advance                  // advance over skip class
	jmp	l6                        // loop looking for first dgt
l7:	cmp	esp,0                     // failure return NE status
	ret

o1:	call	_Toknadv
OCT:	test	CHARCLASS,oct
	jne	o1
	jmp	MAKEOCT
	
h1:	call	_Toknadv
HEX:	test	CHARCLASS,hex
	jne	h1
	jmp	MAKEHEX

INTG:	call	_Toknadv                // matched a digit copy to token buffer.
	test	CHARCLASS,dgt           // is it a dgt?
	jne	INTG                    // loop until not a dgt
	jmp	MAKEINT                 // and then make a numeric object.
}}

This is not mind bending. You simply start hand compiling the source into code, and in the process you figure out how the code generator needs to work. This hand-compiling step is a learning process. I am not even trying to write optimized code; if I were, I would recognize that the first character of every alternative is not a skip_class character and skip skip_class characters once before trying to match the first token character. _TokenEntry intercepts the return of the rule calling it. On failure it restores the input stream state. On success it creates a symbol if an interceding conversion function has not been called. MAKEINT, MAKEHEX, etc. are conversion functions that intercede in the making of a symbol.
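
To give an idea of what a conversion interceder does, here is a rough sketch of a MAKEINT-like maker. The helper names (push_parse_stack, new_integer_object and so on) are placeholders of mine, not the cc run time, whose numeric objects are actually arbitrary precision rather than C longs.

#include <cstdlib>

extern char  TokenBuf[];                  // filled by _Toknadv
extern char* tokenptr;                    // fill pointer, reset after conversion
extern void  push_parse_stack(void* obj); // placeholder for the parse-stack push
extern void* new_integer_object(long v);  // placeholder; real objects are arbitrary precision
extern void  conversion_done();           // tells _TokenEntry not to catalog a symbol

void make_int() {                         // sketch of a MAKEINT-style conversion
    *tokenptr = 0;                        // make sure the token text is terminated
    long v = std::strtol(TokenBuf, nullptr, 10);   // decimal digits gathered by the rule
    push_parse_stack(new_integer_object(v));
    tokenptr = TokenBuf;                  // empty the token buffer for the next token
    conversion_done();
}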

The following are the cc grammar rules for a token. The main driver rules have already been posted here, including the rule that matches the .. operator and calls the token rule.

str_const  .. '"' $(~'"' | """""" ,'"') '"'  MAKSTR();

token        =   chr_pattern (maker:INTERCEPT!2 ('|' token:ALT!2 |--) |--);  // INTERCEPT conversion call allowed in outer alternates.
chr_pattern  =  +[basic $ basic]+;
basic        = '$' basic :$!1 |'(' tokenxpr ')' | str_const | char_match | insert; 
tokenxpr     =  chr_pattern ('|' (+"--" | tokenxpr) :ALT!2 | -- );
char_match   =  ('+':RETAIN|'-':NEG|'~':NOT) (str_const | chr_const)!1 | char_test;
char_test   ==  +"any" | +"ANY" | (chr_const | id) ('*' :$!1 | --);
insert       =  ',':INSERT (chr_const | str_const)!1;
maker       ==  procid '(' +[ (m_arg $(',' m_arg)) |-- ]+ ')' :CALL!2;  //NOTE procid tests for the ( in the token rule. 
m_arg        =  str_const | integer | character | id;
procid     .. alpha symchr* ?'(';   // skipclass skipping is off when peeking for ( following the id. No white space allowed. 

I am experimenting with the * repeat operator of more recent metalanguages.
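
Hand compiling x* is the same loop shape as $ x. A rough portable sketch (match_symchr is a placeholder of mine for whatever the operand compiles to):

extern bool match_symchr();     // any single-operand matcher

bool repeat_symchr() {          // symchr*  /  $ symchr
    while (match_symchr())
        ;                       // keep consuming as long as the operand matches
    return true;                // zero matches is still success
}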

META II compiled to interpreted code. The interpreter was in 1401 assembly and had many times more lines of code than the metalanguage source. That wasn't automagically generated; the interpreter was coded by hand.

The point of the above is to show the amount of run-time code compared to the metalanguage source code. All of the parsed or matched entities are allocated objects. The objects created since a point of backtrack are released on a backtrack failure. The backtracking in the functions above is single-element backtracking involving only the input stream. Token matching and string matching are similar, only restoring the input stream state on a failure. Objects are only created on success.

The metacompiler this is from is designed to generate binary code for any target machine architecture, so it has a bit more of a run-time library. Numeric values are arbitrary-precision objects. For example, the target processor could be a 32-bit machine while the compiler is running on a 16-bit machine, so a multiple-precision arithmetic library is required.

More adversarial bickering

My adversary Damon wrote:

  • Steamerandy claims this article should be changed because statements in the article are wrong

There are two things wrong.

1. "The feature that sets a metacompiler apart from a standard compiler-compiler is that a metacompiler is written in its own language and translates itself."

Well, the fact is that not all metacompilers are written in their own language. In fact, Damon Simms supplied references to sites that run counter to his own position that a metacompiler is written in its own language. And this is what Damon Simms calls misconstruing his references.

2. The second is a minor point about a link to formal grammar, which I have explained above. It would not be a problem with my suggested change.

  • Damon: I have provided multiple references to counter his claims, which he ignores or misconstrues

Damon gave several references to sites where metacompilers were written in PL/I or assembly, which proved my point.

Damon gave a number of references to so-called forth metacompilers. On reading and researching I found the forth metacompiler not to be a metacompiler at all. Forth freaks consider forth a metacompiler just because it compiles itself. From www.forthos.org:

When forth is coded in Forth, the process of building a new forth system is called metacompilation. Like most metacompilers, ForthOS's has restrictions on what kinds of forth code are permissable. Forth (programming language)#Self-compilation and cross compilation www.forthos.org

That says it all right there. All references to forth should be removed. FORTH IS NOT A METACOMPILER!!!!

Damon gave a reference to a blog article. It was written after the metacompiler topic was first posted by Damon, so in all probability it is just repeating what was said here.

  • He wants to change the article to a more narrow explicit focus that will be about one instance of the technology, his favorite

WRONG AGAIN. I am most interested in getting rid of this ignorant idea that a metacompiler must compile itself. It is unfounded in real computer science literature. FORTH is a dead fringe language. Bury it where the sun don't shine!! The only legitimate definition of metacompiler in the ACM archive is from 1970, given above. That is the first and only definition in ACM publications. Do you not recognize the ACM?

  • He doesn't even like the article topic, disputing that it is a real topic

That is a lie. I like the topic. The only reason for Damon to lie is that he is wrong and is blowing smoke trying to make me look bad. That comment was made about Damon's ignorant link to formal grammar. That one link would eliminate most of the existing metacompilers. The statement was that if he keeps that reference there are few if any metacompilers left. It eliminates all of Schorre's metacompilers; they all use analytic grammars. He obviously does not understand the difference between the types of grammars. He thought I was talking about code production rules. Really. He made a change to code production. It's here, posted by him. Does Damon even have a degree?

  • He has even stated the whole article should be deleted based on his insistence that other metacompilers don't exist

Again a lie. This has to do with his ignorance of the fact that a formal grammar is a generative/production grammar. What is an article with no subjects? And since Damon Simms thinks he is so right about grammar and insists on referencing formal grammar, claiming that is the grammar used by all metacompilers, Damon Simms has actually eliminated probably all of them. So nothing fits Damon Simms' narrow definition. Damon thinks a production grammar is about producing code. Well, you had better go back to school, Damon. A production grammar is one that generates valid strings of a language. NOTE the action here: it generates valid strings. It does not analyze a language construct for correctness; just the opposite. BNF is a production grammar. I have tried to explain this to Damon to no avail. He just can not grasp this most basic concept. I do not see how he got a computer science degree.
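
The difference can be shown in a few lines. This is my own toy example, not from any reference: the production/generative view of a rule like digits -> digit digits | digit builds valid strings, while the analytic view of the same rule recognizes them.

#include <string>
#include <cstdlib>

std::string generate_digits() {                  // generative: produce a valid string
    std::string s(1, char('0' + std::rand() % 10));
    if (std::rand() % 2) s += generate_digits(); // optionally keep producing
    return s;
}

bool recognize_digits(const char*& p) {          // analytic: test an input string
    if (*p < '0' || *p > '9') return false;      // must start with a digit
    while (*p >= '0' && *p <= '9') ++p;          // consume the rest of the digits
    return true;
}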

  • He cobbles together bits and pieces from related articles on computer theory and research to make his point, but often they don't make sense, at least not to me (hey, what do I know, I just have 2 degrees and 40 years experience in this stuff)

It doesn't show, does it? I mean his claimed knowledge and expertise.

Damon Simms cannot comprehend the difference between a generative grammar and an analytic grammar. Damon Simms is awestruck by the normal operation of bootstrapping. Damon Simms has misused the definition of bootstrapping, showing his lack of computer science terminology expertise, which he so boastfully claimed to have. Damon Simms posts links to sites that prove my point and, like a little child, tries to blame me for using his references against him. He says I misconstrue his references. I do not see how I am in any way responsible for his posts supporting my arguments.

Apparently Damon Simms knows how to flap his jaws showing what he does not know. That is a FORTH FREAK characteristic, is it not? One would think that such a learned individual would be able to navigate the web to check out the references they gave. A computer scientist should know about the grammar types, should they not? An experienced programmer who has worked on compilers should not find the initial coding/bootstrap process mind bending. See above: the student who became a professor.

  • When provided with references that counter his claims -- such as the existence of other metacompilers, or examples of the use of "metacompiler" as an accepted concept that exists apart from his own narrow definition -- he ignores or muddles the evidence, or explains them away with muddled logic

Forth is not a metacompiler. Damon Simms is just trying to weasel forth into being acknowledged as a metacompiler, but seems not to know that forth is not defined by a formal grammar.

When forth is coded in Forth, the process of building a new forth system is called metacompilation. Like most metacompilers, ForthOS's has restrictions on what kinds of Forth code are permissable. Forth (programming language)#Self-compilation and cross compilation www.forthos.org So Damon must be a forth freak believing that forth is a metacompiler.

Damon, I am very sorry to inform you that forth is dead. Quit digging forth references up. Leave them buried.

If forth is a metacompiler then so is every other compiler written in its own language: PASCAL, C, C++, etc. There was a FORTRAN compiler written in FORTRAN. Let's get LISP into the club.

  • He keeps trying to show that key points in the article are wrong, but as far as I can tell it's based on his misreading of the info and a failure to keep up in this topic over the years, except for his own little narrow slice of the technology -- I have encountered his type before, if he doesn't know about something, then it must not exist, even feeling he can make fun of anyone associated with it

OMG!! 4COL WABOC!!

The situation seems just the opposite. Damon thinks the norm is mind-bending. It is mind-bending to him that someone can code a compiler from scratch; a newbie reaction to something they have not seen before. I have a very different background than Damon, having worked on operating systems and compilers for most of the 49 years I have been programming.

Damon is a forth freak who tries to run down anyone that disagrees with him. That is all he has done here: try to run me down. What arguments has Damon put forth against my actual proposed changes? Oh, that's right, he put forth forth!!! Or should I say forth first. Damon says, in effect, "I have degrees so I am the authority here." Look at his response to the other fellow who posted about his outlandish idea of bootstrapping being mind-bending. Look at his lack of understanding of other computer science terms that I have already pointed out.

Damon says he has experience bootstrapping. If it was such a mind-bending experience, well, that just says it all. He isn't much of a programmer. This mind-bending experience just shows what level of a programmer he is. Writing code from scratch is mind-bending to Damon Simms. A level 1 NB programmer: give them simple modification tasks.

  • He keeps posting his own technical history all over Wikipedia, on the Talk pages of related articles -- I assume as proof of his authority in this subject, but I'm not sure and it does make me wonder.

It has become apparent that Damon is no authority on metacompilers. Forth is no more a metacompiler than is LISP. They are simply extensible languages. By the way, I learned forth a long, long time ago, and have forgotten most of it.

Damon Simms explains Schorre naming his metacompiler

The first two metacompilers made by Schorre were called "Meta" -- Meta-I and Meta-II. I believe Schorre named his metacompiler "Meta" after the term "metacompiler", just as Microsoft named their word processing program "Word". It seems Steamerandy believes the opposite, that "metacompiler" is named after those first programs of Schorre and there is nothing special about the term "metacompiler". He completely denies all the evidence I have produced backing up the idea that "metacompiler" is a basic concept long accepted in Computer Science and Compiler Development and Software Engineering.

Damon Simms doesn't even know computer science history. We can look back to early books from the late nineteen fifties and sixties and find references to metalanguages. The term metacompiler shows up in the sixties, and guess where. If there is an earlier metacompiler before META I, what, when and where was it documented? The TREEMETA document says that META I was the first metacompiler. "Tree_Meta for the XDS_940"

Val Schorre explains META II was meant as a metalanguage compiler

From the early META II document written by D. Val Schorre at UCLA in 1963:

As mentioned earlier, META II is not presented as a standard language, but as a point of departure from which a user may develop his own META language. The term "META Language," with "META" in capital letters, is used to denote any compiler writing language so developed. ("META II", D. V. Schorre, UCLA Computing Facility)

Damon Simms really does not understand metacompiler history. As Val Schorre explains, META II was meant as a metalanguage compiler, a base from which to develop further metalanguages and metacompilers.

When forth is coded in forth, the process of building a new forth system is called metacompilation?

When forth is coded in forth it is a self-hosting compiler. C++ is a self-hosting compiler, as are ALGOL, COBOL, PASCAL, C, MODULA II, etc. Meta, meaning one level above, would describe a programming language description (a metalanguage, a level above the language) compiled by a metacompiler; the result is a compiler for the language described.

Damon Simms stated:

For example, his first claim from his RFC request above, which he seems to repeat over and over, even after I have presented counter-evidence:
the description "The feature that sets a metacompiler apart from a standard compiler-compiler is that a metacompiler is written in its own language and translates itself" is in question
I believe the quotes from that forth webpage above satisfies that complaint. It clearly states that a metacompiler "is a compiler which processes its own source code, resulting in an executable version of itself".

Damon Simms claims that a self-hosting compiler is a metacompiler because some Forth programmers say a self-hosting compiler (FORTH) is a metacompiler.

Furthermore, it also disputes his definition of the prefix "meta-", which he keeps claiming over and over that we got wrong here, thus nullifying our use of the term "metacompiler". The forth reference also shows that the forth community has its own line of metacompilers, satisfying his repeated request to show any other example of metacompilers besides the ones he's pushing. Despite providing this evidence, Steamerandy continues to deny and obfuscate this obvious data countering his claim.

Click on the word forth above and see what the Wikipedia forth topic has to say about it. I wonder why Damon did not link to the Forth (programming language) topic here on Wikipedia? (It proves him wrong.) There is no forth metacompiler. Damon, why don't you go argue with the forth contributors?

So let's see here. Damon Simms said: I believe the quotes from that forth webpage above satisfies that complaint. It clearly states that a metacompiler "is a compiler which processes its own source code, resulting in an executable version of itself". They are talking about forth itself.

I have already gone over this. I misconstrued this by actually digging into the compiler they were talking about and found:

* When forth is coded in Forth, the process of building a new forth system is called metacompilation. Like most metacompilers, ForthOS's has restrictions on what kinds of forth code are permissable. Forth (programming language)#Self-compilation and cross compilation www.forthos.org Journal of FORTH Application and Research, Volume 4 Issue 2, June 1986 Pages 257 - 258

He posted numerous forth sites. Ultimately all turned out to be forth freaks claiming forth to be a metacompiler. The question is: why did he not reference the Wikipedia forth topic?

So Damon Simms is claiming that any compiler that compiles itself is a metacompiler. That means that C and C++ are metacompilers just like forth, according to Damon Simms' posted links. Oh wait, PASCAL is a metacompiler. And FORTRAN. Damon Simms claims FORTH to be a metacompiler.

Damon Simms says he wants to teach metacompilers. I wonder, does he intend to teach FORTH as a metacompiler? Damon Simms provided links to a site for a metacompiler written in PL/I, and then cries that I misconstrue his references. What! Because I read them?

There were two early metacompilers written at SDC, Book1 and Book2, written in LISP 2, I assume by Erwin Book. The early history is documented in "Tree_Meta for the XDS_940".

It is a fact that extensible languages such as LISP and FORTH share some of the properties of a metacompiler. But they are not metacompilers.

Damon, is it really your opinion that forth is a metacompiler?--Steamerandy (talk) 08:32, 17 October 2014 (UTC)--Steamerandy (talk) 00:11, 11 November 2014 (UTC)  [reply]

Unsubstantiated claim refuted

To sum this up: I have listed six metacompilers written in languages other than themselves, absolutely proving wrong the statement "The feature that sets a metacompiler apart from a standard compiler-compiler is that a metacompiler is written in its own language and translates itself." It is an unsubstantiated, refuted claim and therefore should be removed.


--Steamerandy (talk) 23:14, 17 November 2014 (UTC)[reply]

More RFC Request for Consensus responses

I am here in response to the Request for Consensus. I am not sure where I am meant to give my view – but the above all seems to be an argument about whether there ought to be a request for consensus, so I'll give my view here. This article is, or ought to be, about what most programmers call "metacompilers". Some Forth programmers use the term "metacompiler" in an idiosyncratic way. This might be mentioned somewhere in the article, but certainly not in the first paragraph. Maproom (talk) 08:16, 10 November 2014 (UTC)[reply]


RFC mediation

Summoned by bot. For the amount of text that was written here, the article on metacompilers isn't very well written regardless of the definition. But about this rfc: you can't expect other editors to read through this whole talk page, i might as well read a book on metacompilers. I don't know if I'm the first editor summoned, but i expect others were summoned but didn't even try. So i ask both authors to summarize your point of view in less than 500 words, so other editors can assist. Linguistically a metacompiler compiles compilers and a compiler that can compile itself would be an autocompiler. But common usage may have deviated, so both please summarise your arguments for the notability of either definition. Note that I'm no computer scientist, nor should i need to be to understand your summary. And please, both of you, don't take this discussion personally. PizzaMan (♨♨) 07:50, 28 November 2014 (UTC)[reply]


References

  1. Glennie, A. (July 1960). On the Syntax Machine and the Construction of a Universal Compiler (Tech. Report No. 2). Carnegie Inst. of Tech.
  2. Schorre, D. Val (1963). "A Syntax-Directed SMALGOL for the 1401". ACM Natl. Conf., Denver, Colo.
  3. Schorre, D. Val (1964). "META II: a syntax-oriented compiler writing language". Proceedings of the 1964 19th ACM National Conference.
  4. Book, Erwin; Schorre, Dewey Val; Sherman, Steven J. (June 1970). "The CWIC/360 system, a compiler for writing and implementing compilers". ACM SIGPLAN Notices. 5 (6): 11–29. doi:10.1145/954344.954345.
  5. Neighbors, J. M. (1980). Software Construction using Components. Technical Report 160, Department of Information and Computer Sciences, University of California, Irvine.


The discussion above is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.