just read the mess with lqq
and all i can think about is xl's face in the manhua when he asks hc to spend the night
#my posts#animanga#liveblogging#because its a complicated m e s s#and xl was so against the entire thing being cleared up#TO PROTECT LANG QIAN QIU#and its just#im like holy fuck im so glad hc exists#im so glad they found each other#IM SO GLAD HE DOESNT HAVE TO BE ALONE ANYMORE#and im also rerererereading the ox cart scene in the manhua#and fucking crying bc every panel of xl is like a stab in the heart#(he gave him the leaf LET ME DECOMPOSE)#and the entirety of ch 24 just hurts bc 1. san lang is so fUCKING BEAUTIFUL#and 2. xl's face at that particular part kills me#HE JUST LOOKS SO EARNEST AND GENTLE#AND I WANT HIM TO BE HAPPY FOREVER#and now hc goes and says 'just keep doing what you want to do'#AND ISNT THAT JUST IT#ISNT THAT WHAT I KEEP SCREAMING ABOUT#HOW THEYRE JUST OUT THERE DOING THEIR BEST TOGETHER#AND BEING ENDLESSLY SUPPORTIVE OF EACH OTHER#:((((((((#liveblogging tgcf
Review - Metroid: Samus Returns
Metroid: Samus Returns is another mediocre official Metroid release that shows Nintendo just doesn't know what to do with the franchise. Fans of Metroid should temper their expectations; newcomers should probably look elsewhere.
Note: there will be some minor spoilers in this review.
It's impossible for me not to compare Samus Returns to last year's fan-made Another Metroid 2 Remake (AM2R). Both games reimagine a Game Boy-only sequel to a popular NES game, but take drastically different approaches. AM2R was my favorite game of 2016. It captured the spirit of the genre and franchise so well, while also modernizing it in some smart ways. It was the best Metroid-related thing I've played since Super Metroid. It's unfortunate to report that Nintendo's official remake doesn't come close to achieving that.
Samus Returns feels like a lot of the other handheld Metroid games. Just as Fusion and Zero Mission suffered from the limitations of the GBA's fewer buttons, Samus Returns faces similar issues. I played it on an original 3DS XL and often ended sessions because my hands (particularly my left hand) hurt. The game requires you to hold the shoulder buttons, sometimes both at once, to aim, and it feels awkward. Using the circle pad to both move and aim doesn't quite allow precision with either. Holding L to lock your position and then aiming with the circle pad is never as precise as you want it to be. Even trying to shoot directly forward while moving is a chore. Firing missiles while holding R often ends with them going slightly off course. This makes boss fights and some of the tighter exploration far more frustrating than enjoyable. Controls can't be changed either. There is no option to make L and R toggles instead of holds. In an ideal world, you would be able to move with the D-pad, aim with the circle pad, and use L to cycle through the new Aeion abilities. Well, in a truly ideal world this game would be on a platform with a modern gamepad, but regardless, the developers did not make good enough use of what they had. Control customization has been widely unavailable in the first-party 3DS games I've played, and it's never not a knock against them. Nintendo really needs to step up their game and allow players to rebind their controls. It's clearly not a priority for them, and that needs to change, especially considering it was a feature in many first-party SNES games.
Controls aside, Samus Returns is lackluster in other ways. Where in other Metroid games each area had its own unique appearance, that's not really the case here. While there are some really beautiful unique backgrounds and the occasional one-off set piece, all the rooms are built from the same few tile sets. Here's the purple caves, here's the ancient tech, etc. Music is bound to the tile sets too: hot rooms always have a rearranged Lower Norfair theme, and so on. This creates a feeling of sameness. It's easy to forget what area you're in when exploring for missed upgrades. Lack of enemy variety compounds this. You face maybe five or six different enemy types throughout the game. There are upgraded versions that are colored slightly differently, but the tactics don't generally change much. Maybe you use the Screw Attack against this version, or Power Bombs against that one, but generally once you've seen an enemy you know how you'll be dealing with it for the rest of the game.
I have mixed feelings about the combat. They tried something new by giving you the ability to "parry" certain incoming attacks. It works well enough and is utilized well in boss fights, but for the majority of encounters against regular enemies it's the most common way to deal with them. Your gun is pathetic, even after upgrades. Most enemies take tens of shots to bring down. It's best to let them do their one attack and then counter it. Countering leaves enemies dazed, and usually it then only takes a single shot to kill them. This makes shooting feel flimsy and underpowered. I ended up trying to avoid enemies if I could, or impatiently waiting for them to do something I could counter. It really ended up bringing the game to a crawl: move a screen forward, counter the enemy, repeat. A lot of the combat is in boss battles. Just like the other versions of Metroid 2, this game features many encounters with metroids at various points in the metroid life cycle. These encounters are by far the best part of the game. I particularly liked the fights against the Zeta and Omega metroids. If there is anything this game has over AM2R, it's these fights. However, that's not saying too much, because the metroid fights were the weakest part of AM2R. Here, they're varied and more interesting, though they're still fairly repetitive, and I wish there was even more variation in the arena designs, especially for the later encounters. I wish I could be as positive about the non-metroid boss fights, because those are by far the worst part of the game. There aren't many, only about three or four, but one in particular was so awful that I considered putting down the game for good. That boss is a large robot thing you face late in the game. The fight has the typical hallmarks of poor boss design: short windows when it can be damaged, long periods (2+ minutes!) where you can do nothing but dodge, and one-off gimmick mechanics that aren't very clear.
But one thing was especially egregious. In its second phase it has an inhale attack that is really easy to dodge. You can see it sucking up rocks from the ground, and if you try dropping bombs in its path, they do nothing. Then in the third phase it has that same attack, but this time, to advance the fight, you HAVE to drop bombs that get sucked up and damage it. That's some extremely questionable design! None of the bosses in this game are super challenging. They're all very pattern-based and focus completely on having the player recognize those patterns. The cost for any mistake, however, is tons of damage. It's not unusual for an attack to knock more than two full tanks of health off you. Fortunately, the game checkpoints before every fight, so when you die you can, after a rather long loading sequence, start the fight over without losing progress.
The abilities in this game, outside of the new Aeion abilities, are pretty standard for a Metroid game. The Aeion ones are fine, but not used very well. Of the four, I found myself only really using two of them frequently; the other two were completely situational. One thing of note is that almost every ability you get also acts as a key to a particular type of door. "You need the charge beam to open this door," etc. It's odd. In some ways it makes sense and prevents sequence breaking (though why would you?). On the other hand, coupled with how underpowered the shooting already is, it makes your weapons feel like mere keys. They do find some clever ways to use the grapple beam that, as a fan of the series, I certainly appreciated. Then there are the upgrades. Missile tank upgrades are a common thing in Metroid games. Typically they end up giving you around 255 missiles if you collect all of them, and this game is no different. However, in Super Metroid you would pick up 5 at a time, meaning there are 51 missile upgrades in that game. In this game you pick up 3, meaning there are 85 missile tanks to collect! You pick up so many missile upgrades over the course of this game; almost all the rewards for exploration are more of them. You never need more than maybe 100 missiles, and once you have super missiles you're more likely to rely on those, or to simply counter enemies instead. Missiles aren't an important resource in this game at all, but man, will you be collecting them.
I can't say that Samus Returns is awful. It's not. It's playable, but unremarkable. Coming off the heels of AM2R really paints this mediocre game in a much worse light. Nintendo needs to do with Metroid what Sega did with Sonic Mania: give it over to the fans who know what they want rather than making another dull addition. I'm starting to feel like Super Metroid was lightning in a bottle and they've been failing to recapture that ever since, at least with the 2D Metroid games. I have not played the Prime games and have no opinion on them. AM2R is a much superior product, and I'm excited to see what comes from that dev in the future.
I'm not sure who to recommend this game to. Fans of the series will likely be disappointed. Fans of the genre will find a playable but uninspired game. Newcomers to both should probably start elsewhere. The price is too high and the game is not worth it. I have not tried hard mode or used any of the Amiibo stuff, so I cannot comment on those. I know Metroid fans have been clamouring for a new 2D Metroid, but they should turn to other developers for the higher-quality experience they're looking for, as they won't find it in this game.
As always you can find my as-I-played thoughts on the game in my List of Games Played 2017.
#metroid#metroid: samus returns#review#game review#game recommendations#video games#videogames#video game#videogame#AM2R
At the Paris Open Source Summit, I had a long discussion with engineers from AdaCore that reminded me of the early roots of XL in Ada. I had never really retraced the steps along the way, and this was an interesting walk for me. I thought I'd share…
My interest in programming languages is quite old. Actually, I had been writing development tools for as long as I can remember:
My first published program, at age 14, was an extension of the Sinclair Spectrum BASIC that gave new features such as flood fill or a primitive window system.
A few years later, I published HPDS (HP Development System), a cross-compiler for HP-48 and HP-28 calculators that used an extended version of the built-in language, including a complete assembler (you can already see the Alsys commenting style in this example). HPDS did not have much success, but some of the games I developed with it are still available on the Internet today.
Also, I had been deeply fascinated by the Sinclair QL's SuperBasic, which had departed so much from regular BASIC that it was probably closer to Pascal.
The direction of my thinking about programming languages changed drastically when, as an engineering student, I met the Ada programming language.
Step 0 (1990s): From Alsys SA to LX
During my last year in engineering school, my internship was at Alsys SA, a company that produced high-quality Ada compilers, themselves written in Ada. I have very fond memories of that period. I learned quite a bit from the brilliant engineers at that company. The commenting style I still use today is a remnant of that training period.
I learned Ada by reading the reference manual. I remember being very impressed by this language, notably by the standardisation effort. At the time, no two Pascal or C compilers behaved the same. You had the Turbo Pascal dialect, the Think C dialect which was different from MPW C, etc. I also liked the "solid feel" of the Ada language.
Yet, from the very beginning, I felt like Ada was a bit too restrictive. That was the incentive I needed to start thinking about my own programming language.
Step 0bis: WASHB (What Ada Should Have Been)
In the very early days, I thought of my language as some enhanced version of Ada. So I called it WASHB, short for What Ada Should Have Been. Obviously, my knack for catchy names and acronyms had not entirely developed back then. That name did not stick for very long, and WASHB was never anything more than a vague specification.
Step 1 (1995): LX (Langage eXpérimental)
The first serious effort at creating a real language was called LX, the initials of "Langage eXpérimental", or "experimental language" in Shakespearian. I don't have much data on the evolution of the language at that time, besides old Word documents that are unreadable in modern versions of Word. In particular, I no longer have source code that I can reliably trace back to that era.
 An LX compiler generating 68K assembly code
I do remember however that I had a compiler that was generating assembly code for the Motorola 68K family of CPUs. I was developing on Atari ST at the time, and I remember testing the generated code on an embedded board with a 68040 back when I was working for the HP Test and Measurement Organisation in France.
That early compiler went far enough to compile âHello Worldâ. I do believe that this was with a standard library written in the language itself.
A language that can build its own standard library
LX was still very Ada-like. Still, as I recall (I no longer have computer records of that period), several ideas had already solidified by that time, which I will expand on later:
Giving up on superfluous syntactic markers such as terminating semi-colons.
Using generics to write standard library components such as arrays or I/O facilities.
Making the compiler an integral part of the language, which led to…
having a normalised abstract syntax tree, and…
considering "pragmas" as a way to invoke compiler extensions.
I am quite positive about these ideas emerging at the time, because they derived from concerns about Ada having too many "magic thingies", and these concerns were already well formed while I was working at Alsys.
What I disliked about Ada
I never liked magic in a language. To me, keywords demonstrated a weakness in the language, since they indicated something that you could not build in the library using the language itself. Ada had plenty of keywords and magic constructs.
Let me elaborate a bit on some specific frustrations with Ada:
Tasks in Ada were built-in language constructs. This was inflexible. Developers were already hitting limits of the Ada-83 tasking model. My desire was to put any tasking facility in a library, while retaining an Ada-style syntax and semantics.
Similarly, arrays were defined by the language. I wanted to build them (or, at least, describe their interface) using standard language features such as generics. Use cases I had in mind were interfacing with languages that had different array layouts, such as Fortran and C, or using an array-style interface to access on-disk records. Back then, mmap was unavailable on most platforms.
Ada text I/O facilities were uncomfortable. But at that time, there was no good choice, and it was mostly a game of picking the poison that would kill you:
In Pascal, WriteLn could take as many arguments as you needed and was type-safe, but it was a magic procedure that you could not write yourself in standard Pascal, nor extend or modify to suit your needs.
Ada's text I/O facilities only took one argument at a time, which made writing the simplest I/O statement quite tedious relative to C or Pascal.
C's printf statement took multiple arguments, but was neither type-safe nor extensible, and the formatting string was horrid.
I also did not like pragmas, which I found too ad-hoc, with a verbose syntax. I saw pragmas as indicative that some kind of generic âlanguage extensionâ facility was needed, although it took me a while to turn that idea into a reality.
From experimental to extensible
I soon realised that my efforts were mostly about being able to extend the language through its standard library and mechanisms such as pragmas. At some unspecified point in time, somewhere along the way, the meaning of LX changed from experimental language to extensible language. I liked the normalisation of Ada, but I wanted a way to leverage the base language to go beyond the basics in a controlled and specified way.
The development period for LX lasted between my training period at Alsys in 1990 and 1998, when I joined the HP California Language Lab (CLL) in Cupertino to work on the C++ compiler and, I hoped, my own language. It did not all go as planned…
Step 2 (1998): XL, meet Xroma
One of the very first things I did moving to the US was to translate the language name to English. So LX turned into XL. This was a massive rename in my source code, but everything else remained the same.
Daveed Vandevoorde and meta-programming
As soon as I joined the CLL, I started talking about my language and the ideas within it. One CLL engineer who immediately "got it" was Daveed Vandevoorde. Daveed immediately understood what I was doing, in large part because he was tinkering along the same lines. He pointed out that my approach had a name: meta-programming, i.e. programs that deal with programs. I had been doing meta-programming without knowing the word, and I felt really stupid at the time, feeling that everybody but me knew about the technique.
Daveed was very excited about my work, because he was himself working on his own pet language named Xroma (pronounced like Chroma). At the time, Xroma was, I believe, not as far along as XL, since Daveed had not really worked on a compiler. However, it had annotations similar to my pragmas (which I suspect are distant ancestors of C++11 attributes), and some kind of public representation for the abstract syntax tree as well.
Also, the Xroma name was quite Xool, along with all the puns we could build using a capital X pronounced as "K" (Xolor, Xameleon, Xode, …) or not (Xform, Xelerate, …). As a side note, I later called "Xmogrification" the VM context switch in HPVM, probably in part as a residual effect of the Xroma naming conventions.
In any case, Daveed and I joined forces, and the combined effort was named Xroma. I came up with the early version of the lightbulb logo, and Daveed did a nice 3D rendering of it using the Persistence of Vision ray tracer.
Concept programming
The discussions around our respective languages, including the meta-programming egg-on-my-face moment, led me to solidify the theoretical underpinnings of what I was doing with XL. It actually went somewhat beyond meta-programming, which was really only a technique being used, not the end goal. I called my approach Concept Programming. I tried to explain what it is about in this presentation.
Concept programming is about how we transform concepts that reside in our brain into code that resides in the computer. That conversion is lossy, and concept programming explores various techniques to limit the losses. It introduces pseudo-metrics inspired by signal processing such as syntactic noise, semantic noise, bandwidth and signal/noise ratio. More importantly, Concept Programming has consistently guided what I am doing with XL.
From examples of concept programming to C++ concepts
As an aside, talking about concepts, I have reasons to suspect that C++ concepts might be the result of a misinterpretation of several concept-programming e-mail discussions I had with a few C++ committee members around the year 2000. Maybe I'm wrong and it's just a coincidence. But besides the emails I sent, there is some striking similarity.
Consider the following concept example from Wikipedia:
template <class T>
concept bool EqualityComparable()
{
    return requires(T a, T b) {
        { a == b } -> Boolean; // Boolean is the concept defining a type usable in boolean context
        { a != b } -> Boolean;
    };
}
To me, it looks very much like a C++ version of this example (called validated generics in XL), which I was using as an illustration of the outcomes of concept programming in practically any discussion of these topics back in 2000:
generic type ordered where
    A, B : ordered
    Test : boolean := A < B
As of this writing (December 2017), C++ concepts have not yet made it into the C++ standard. By contrast, a variant of the code above shows up on a web page that states "First published on February 17, 2000". Whether reinvented independently or not, this specific idea, which I will keep calling validated generics, took about 20 years to make it into C++.
The one thing that makes me unhappy about C++ concepts is the name "concepts". In Concept Programming, concepts reside in your head, never in the code. So calling something in the code a "concept" is bound to make concept programming much harder to explain to developers who learned C++ concepts first.
A program database to rule them all
Daveed is still a prominent and innovative member of the C++ community today. Back in the CLL days, he was already quite influential, being for example the HP representative to the C++ Standard committee.
He quickly generated quite a bit of interest in the CLL about some kind of universal program database that would represent the program in such a way that various tools could work on it. We called such transformations thin tools. The database format was intended to work across languages, so it was designed to be able to represent a C++ parse tree or a Java parse tree in a very similar way. There was tremendous interest about this kind of technology at the time, and like many things, these ideas took years to materialise elsewhere.
Daveed and I gave several internal talks about our ideas, and I was happy to let him speak, if only because my English accent at the time was much worse than his. That proved to be a mistake…
Switching to the off-side rule
Another major visual change that happened around that time was switching to the off-side rule, i.e. using indentation to mark the syntax. Python, which made this approach popular, was at the time a really young language (release 1.0 was in early 1994).
Alain Miniussi, who made a brief stint at the CLL, convinced me to give up the Ada-style begin and end keywords. I was initially totally unconvinced, and reluctantly tried it on a fragment of the standard library. As soon as I tried it, however, the benefits became apparent. It was totally consistent with the core idea of concept programming, namely that code should look like your concepts. Enforcing indentation made sure that the code did look like what it meant.
It took some effort to convert existing code, but I've never looked back since.
Daveed leaves HP and XL loses traction
Not very long after I joined the CLL, Daveed received one of those offers you can't refuse from the Edison Design Group, the company he still works for to this day. So he left HP. Before leaving, he asked to keep the Xroma name for his own project, and together we decided that I would rename my side Mozart. I was disappointed, because I liked the name Xroma, but did not think too much of it.
However, it quickly became evident that after Daveed's departure, XL had lost all traction in the CLL. The next talk I organised about it drew maybe two people. Clearly, CLL engineers saw Xroma as solely Daveed's baby, with me just tagging along. It was obvious that everybody thought the project had died the day Daveed left HP.
Trying Apple, unsuccessfully
After a few months without any traction at HP, I started looking elsewhere. Through some chance event, I ended up pitching XL to Bertrand Serlet at Apple. I have rarely met someone so smart. I remember being vastly impressed that the second or third question he asked was "How do you deal with introspection?" I did not know at the time how important introspection was in the Objective-C programming model. That question, coming from a higher-up at Apple, was totally unexpected. It convinced me that Apple knew what they were talking about.
In any case, Bertrand Serlet was convinced enough that I interviewed with the compiler team at Apple. Things went quite well, until someone asked, "Are you sure you own this language?" I personally had absolutely no doubt about that, having invented the language long before even joining HP. However, the Apple engineer who asked the question was quick to point out that if I had spoken about it to HP engineers, California work-for-hire agreements probably meant that HP had taken ownership of my work.
So the discussion with Apple stopped abruptly, and I returned to HP somewhat bummed. I still did a few things at the CLL, like porting the HP C++ compiler to Itanium or representing HP at the C++ committee for a couple of years. But I had no real love for C++, and I hit a career wall within the CLL. I finally left the C++ compiler team after only two years, and started HP Integrity Virtual Machines. But that is another story.
Step 3 (2000): Open-sourcing XL and Mozart
Before leaving the CLL, and following the Apple incident, I talked with various people at HP to make sure my intellectual property of XL was acknowledged.
It took quite a bit of time, but we reached an agreement as follows: XL and Mozart had to be made open-source, and published regularly so that HP could monitor whether they were interested in it. Also, because HP had a strong interest in Java at the time, they wanted to make sure that whatever I developed also worked for Java.
Moka, a Java to Java compiler
So I published Mozart and began working actively on it, making sure there was some prominent Java support in it. I also published an article in Dr. Dobb's, a popular developer journal.
But my heart was never with Java any more than with C++, as evidenced by the much more extensive documentation about XL on the Mozart web site. As a language, Java held very little interest for me.
Key innovations in 2000-vintage XL
By that time, XL was already quite far away from the original Ada. Here are some of the key features that went quite a bit beyond Ada:
The syntax was quite clean, with very few unnecessary characters. There were no semi-colons at the end of statements, and parentheses were not necessary in function or procedure calls, for example. The off-side rule I talked about earlier allowed me to get rid of any begin or end keyword, without resorting to C-style curly braces to delimit blocks.
Pragmas extended the language by invoking arbitrary compiler plug-ins. As I already pointed out, I suspect that attributes in C++11 are distant (and less powerful) descendants of this kind of annotation, if only because their syntax matches my recollection of the annotation syntax in Xroma.
Expression reduction was a generalisation of operator overloading that works with expressions of any complexity. To this day, expression reduction still has no real equivalent in any other language that I know of, although expression templates can be used to achieve a similar effect, in a very convoluted way, for expressions following the standard operator syntax in C++.
True generic types were a way to make generic programming much easier by declaring generic types that behaved like regular types. Validated generic types extended the idea by adding a validation to the type, and they also have no real equivalent in other languages that I am aware of, although C++ concepts bring a similar kind of validation to C++ templates.
Type-safe variable argument lists made it possible to write type-safe variadic functions. They solved the WriteLn problem I referred to earlier, i.e. they made it possible to write a function in a library that behaved exactly like the Pascal WriteLn. I see them as a distant ancestor of variadic templates in C++11, although like for concepts, it is hard to tell if variadic templates are a later reinvention of the idea, or if something of my e-mails influenced members of the C++ committee.
A powerful standard library was not quite there, but the key foundations were there, and it was mostly a matter of writing it. My implementation of complex numbers, for example, was 70% faster than C++ on simple examples, because it allowed everything to be in registers instead of memory.
There were a few things that I believe also date from that era, like getting rid of any trace of a main function, top-level statements being executed as in most scripting languages.
Limitations of the parse tree representation
One thing did not work well with Mozart, however, and it was the parse tree representation. That representation, called Notes, was quite complex. It was some kind of object-oriented representation with many classes. For example, there was a class for IfThenElse statements, a Declaration class, and so on.
This was quite complex, and made it extremely difficult to write thin tools, in particular thin tools that respected subtle semantic differences between languages. By 2003, I was really hitting a wall with XL development, and that was mostly because I was also trying to support the Java language which I did not like much.
One of the final nails in the Mozart coffin was a meeting with Alan Kay during an HP technical conference (he was an HP Fellow at the time). I tried to show him how my language solved some of the issues he had talked about during his presentation. He did not even bother looking. He simply asked: "Does your language self-compile?" When I answered that the compiler was written in C++, Alan Kay replied that he was not interested.
That gave me a desire to consider a true bootstrap of XL. That meant rewriting the compiler from scratch. But at that time, I had already decided that the internal parse tree representation needed to be changed. So that became my topic of interest.
Step 4 (2003): Bootstrapping XL2
The new implementation was called XL2, not as a version number, but because I was seeing things as a three-layer construction:
XL0 was just a very simple parse tree format with only eight node types.
XL1 was the core language, without any library.
XL2 was the full language, including its standard library.
This language is still available today, and while it's not been maintained in quite a while, it seems to still pass most of its test suite. More importantly, the XL0 format has remained unchanged since then.
The XL0 parse tree format
The parse tree format is something that makes XL absolutely unique among high-level programming languages. It is designed so that code that can look and feel like an Ada derivative can be represented and manipulated in a very simple way, much like Lisp lists are used to represent programs.
The picture below shows an example of XL source code on the left, along with a graphical rendering of the internal representation on the right (click for details):
The parse tree format consists of only eight node types: four leaf node types (integer, real, text and symbol) and four inner node types (infix, prefix, postfix and block). It is very vaguely documented here.
A few properties of this parse tree format are surprising: individual program lines are seen as the leaves of an infix "newline" operator, and there are no keywords at all, the precedence of all operators being given dynamically by a syntax file.
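To make the shape of XL0 concrete, here is a small illustrative model in C++. This is my own sketch, not XL's actual implementation; the names are invented, but the structure follows the description above: four leaf kinds, four inner kinds, and everything else (operators, lines, blocks) built by combining just those eight.

```cpp
#include <memory>
#include <string>
#include <variant>

struct Node;
using NodePtr = std::shared_ptr<Node>;

// The four leaf node types.
struct Integer { long value; };
struct Real    { double value; };
struct Text    { std::string value; };
struct Symbol  { std::string name; };

// The four inner node types.
struct Infix   { std::string op; NodePtr left, right; };    // A + B; also "newline" between lines
struct Prefix  { NodePtr op, arg; };                        // -A, sin X
struct Postfix { NodePtr arg, op; };                        // 3 km
struct Block   { std::string open, close; NodePtr child; }; // (A), [A], or an indented block

struct Node {
    std::variant<Integer, Real, Text, Symbol, Infix, Prefix, Postfix, Block> data;
};

template <typename T>
NodePtr make(T v) { return std::make_shared<Node>(Node{std::move(v)}); }
```

With this model, "X + 1" is make(Infix{"+", make(Symbol{"X"}), make(Integer{1})}), and a two-line program is an Infix node whose operator is the newline, mirroring how a Lisp list uniformly represents Lisp programs.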
Bootstrapping XL
The initial translator converts a simplified form of XL into C++ by straightforward transcoding. The simplified form of XL2 acceptable as input for this translation phase is only used in the bootstrap compiler. It already looks a bit like the final XL2, but error checking and syntax analysis are practically nonexistent.
The bootstrap compiler can then be used to translate the native XL compiler. The native compiler performs much more extended semantic checks, for example to deal with generics or to implement a true module system. It emits code using a byte-code that is converted to a variety of runtime languages. For example, the C bytecode file will generate a C program, turning the native compiler into a transcoder from XL to C.
That native compiler can translate itself, which leads to a true bootstrap where the actual compiler is written in XL, even if a C compiler is still used for the final machine code generation.
The XL2 compiler advanced to the point where it could pass a fairly large number of complex tests, including practically all the things that I wanted to address in Ada:
Pragmas implemented as compiler plug-ins.
Expression reduction generalising operator overloading.
An I/O library that was as usable as in Pascal, but written in the language and user-extensible.
A language powerful enough to define its own arrays or pointers, while keeping them exactly as usable as built-in types.
Compiler plugins
XL2 had full support for compiler plug-ins, in a way similar to what had been done with Mozart. However, plug-ins were much simpler to develop and maintain, since they had to deal with a very simple parse tree structure.
For example, the differentiation plugin implements symbolic differentiation for common functions. It is tested here. The generated code after applying the plugin would look like this. The plugin itself is quite simple. It simply applies basic mathematical rules on parse trees. For example, to perform symbolic differentiation on multiplications, the code looks like this:
function Differentiate (expr : PT.tree; dv : text) return PT.tree is
    translate expr
        when ('X' * 'Y') then
            dX : PT.tree := Differentiate(X, dv)
            dY : PT.tree := Differentiate(Y, dv)
            return parse_tree('dX' * 'Y' + 'X' * 'dY')
Meta-programming became almost entirely transparent here. The translate statement, itself provided by a compiler plug-in (see below), matches the input tree against a number of shapes. When the tree looks like X*Y, the code behind the matching then is evaluated. That code reconstructs a new parse tree using the parse_tree function.
Also notice the symmetric use of quotes in the when clause and in the parse_tree function, in both cases to represent variables as opposed to names in the parse tree. Writing parse_tree(X) generates a parse tree with the name X in it, whereas parse_tree('X') generates a parse tree from the X variable.
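In Python terms (a sketch of the same idea, not the plugin's actual code), the product rule above could be written with explicit pattern matching on a small tuple-based tree representation:

```python
# Trees are tuples like ("*", x, y) or ("+", x, y); leaves are names/numbers.
def differentiate(expr, dv):
    if expr == dv:                 # d(dv)/d(dv) = 1
        return 1
    if not isinstance(expr, tuple):
        return 0                   # constants and other names
    op, x, y = expr
    if op == "*":                  # product rule: (x*y)' = x'*y + x*y'
        return ("+", ("*", differentiate(x, dv), y),
                     ("*", x, differentiate(y, dv)))
    if op == "+":                  # sum rule: (x+y)' = x' + y'
        return ("+", differentiate(x, dv), differentiate(y, dv))
    raise ValueError(f"no differentiation rule for {op!r}")

# d(a*x)/dx = 0*x + a*1
print(differentiate(("*", "a", "x"), "x"))  # ('+', ('*', 0, 'x'), ('*', 'a', 1))
```

The XL version is shorter precisely because translate and parse_tree hide the tuple plumbing: patterns and replacements are written in concrete syntax.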
Translation extension
A particularly important compiler extension provided the translation and translate instructions. Both were used extensively to rewrite XL0 parse trees easily.
We saw above an example of translate, which translated a specific tree given as input. It simply acted as a way to compare a parse tree against a number of forms, evaluating the code corresponding to the first match.
The translation declaration was even more interesting, in that it was a non-local function declaration. All the translation X statements from all modules were accumulated into a single X function, with several such functions corresponding to distinct phases of the compiler. This made it possible to distribute translation XLDeclaration statements throughout the compiler, dealing with the declaration of various entities, with matching translation XLSemantics statements for the later semantic analysis phase.
This approach made it quite easy to maintain the compiler over time. It also showed how concept programming addressed what is sometimes called aspect-oriented programming.
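A hypothetical Python analogue of this accumulation (the names and machinery here are mine, not the XL2 compiler's): rules registered from many modules under one phase name, merged into a single dispatch function per phase:

```python
from collections import defaultdict

_phases = defaultdict(list)      # phase name -> list of registered rules

def translation(phase):
    """Register a rule for a compiler phase, possibly from another module."""
    def register(rule):
        _phases[phase].append(rule)
        return rule
    return register

def run_phase(phase, tree):
    """The single function accumulated from all `translation phase` rules."""
    for rule in _phases[phase]:
        result = rule(tree)
        if result is not None:
            return result
    return tree                   # no rule matched: leave the tree unchanged

# Two rules contributed "from different modules":
@translation("XLDeclaration")
def declare_variable(tree):
    if isinstance(tree, tuple) and tree[0] == "var":
        return ("declared", tree[1])

@translation("XLDeclaration")
def declare_function(tree):
    if isinstance(tree, tuple) and tree[0] == "function":
        return ("declared-fn", tree[1])

print(run_phase("XLDeclaration", ("function", "f")))  # ('declared-fn', 'f')
```

Each module adds its own cases for each phase, yet callers see one function per phase, which is the aspect-oriented flavour described above.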
Step 5 (2009): Dynamic code generation
One issue I had with the original XL2 approach is that it was strictly a static compiler. The bytecode files made it possible to generate practically any language as output. I considered generating LLVM bitcode, but thought that it would be more interesting to use an XL0 input instead. One reason to do that was to be able to pass XL0 trees around in memory without having to re-parse them. Hence XLR, the XL runtime, was born.
XLR, the functional variant of XL
For various reasons, I wanted XLR to be dynamic, and I wanted it to be purely functional. My motivations were:
a long-time interest in functional languages.
a desire to check that the XL0 representation could also comfortably represent a functional language, as a proof of how general XL0 was.
an intuition that sophisticated type inference, Haskell-style, could make programs both shorter and more solid than the declarative type systems of Ada.
While exploring functional languages, I came across Pure, and that was the second big inspiration for XL. Pure prompted me to use LLVM as a final code generator, and to keep XLR extremely simple.
Translating using tree rewrites
As a matter of fact, I sometimes describe XLR as a language with a single operator, ->, which reads as transforms into. Thus, X->0 declares a variable X. This notation can be used to declare basic operators:
x:integer - y:integer as integer -> opcode Sub
It makes a declaration of writeln even shorter than it was before:
write x:text as boolean        -> C elfe_write_text
write x:integer as boolean     -> C elfe_write_integer
write x:real as boolean        -> C elfe_write_real
write x:character as boolean   -> C elfe_write_character
writeln as boolean             -> C elfe_write_cr
More interestingly, even if-then-else can be described that way:
if true  then TrueBody else FalseBody   -> TrueBody
if false then TrueBody else FalseBody   -> FalseBody
if true  then TrueBody                  -> TrueBody
if false then TrueBody                  -> false
Similarly for basic loops, provided your translation mechanism implements tail recursion properly:
while Condition loop Body ->
    if Condition then
        Body
        while Condition loop Body
until Condition loop Body -> while not Condition loop Body
loop Body -> Body; loop Body
for Var in Low..High loop Body ->
    Var := Low
    while Var < High loop
        Body
        Var := Var + 1
Note that the fact that such structures can be implemented in the library does not mean that they have to be. It is simply proof that basic amenities can be constructed that way, and it provides a reference definition of the expected behaviour.
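To see why proper tail recursion matters for the while definition above, here is a direct Python transcription of the rewrite (purely illustrative; Python does not eliminate tail calls, so this version only works for short loops, whereas a tail-call-eliminating runtime runs it in constant stack):

```python
def while_loop(condition, body):
    # Transcription of:  while Condition loop Body ->
    #     if Condition then (Body; while Condition loop Body)
    if condition():
        body()
        while_loop(condition, body)   # tail call: the loop IS the recursion

# Sum 1..10 using only the recursive definition
state = {"i": 0, "total": 0}

def cond():
    return state["i"] < 10

def step():
    state["i"] += 1
    state["total"] += state["i"]

while_loop(cond, step)
print(state["total"])  # 55
```

Without tail-call elimination, every iteration would consume a stack frame, which is why the XLR translation mechanism has to implement it for these library-defined loops to be usable.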
Step 6 (2010): Tao3D
When I decided to leave HP, I thought that XLR was flexible enough to be used as a dynamic document language. I quickly whipped together a prototype using XLR to drive an OpenGL 3D rendering engine. That proved quite interesting.
Over time, that prototype morphed into Tao3D. As far as the XLR language itself is concerned, there wasn't as much evolution as previously. A few significant changes related to usability popped up after actively using the language. For example, implicit conversion of integer to real was not in the original XLR, and its absence was quite annoying in practice when providing object coordinates.
Tao3D developed a relatively large set of specialised modules, dealing with things such as stereoscopy or lens flares. As a product, however, it was never very successful, and Taodyne shut down in 2015.
Step 7 (2015): ELFE
ELFE is another application of XLâs extensibility to another application domain, namely distributed software. The idea was to take advantage of the existence of the XL0 standard parse tree to communicate programs and data across machines.
An ELFE program looks as if it were running on a single machine, but actively exchanges program segments and their associated data between distant nodes. ELFE only adds a very small number of features to standard XL:
The ask statement sends a program, and returns the result of evaluating that program as if it has been evaluated locally. It works like a remote function call.
An invoke statement sends a program to a remote node. It's a "fire and forget" operation, but leaves a reply channel open while it's executing.
Finally, the reply statement allows a remote node to respond to whoever invoke'd it, by evaluating one of the available functions in the caller's context.
A few very simple ELFE demos illustrate these remote-control capabilities. For example, it's easy to monitor temperature on two remote sensor nodes, and to ask them to report if their temperatures differ by more than some specified amount.
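A toy Python analogue of ask (purely illustrative: ELFE's real protocol ships XL0 trees between nodes, not Python source, and the class below is my own invention): send a small program to a "node", evaluate it in that node's context, and return the result as if it were a local call:

```python
class Node:
    """A pretend remote sensor node with its own evaluation context."""
    def __init__(self, temperature):
        self.context = {"temperature": temperature}

    def ask(self, program):
        # Evaluate the received program in the node's own context,
        # returning the result like a remote function call would.
        return eval(program, {}, self.context)

living_room = Node(21.5)
cellar = Node(12.0)

# "Ask" each node for its temperature and compare the answers locally
delta = living_room.ask("temperature") - cellar.ask("temperature")
print(delta)  # 9.5
```

The point of the real design is that the program being shipped is an ordinary XL0 tree, so no separate wire format or serialization layer has to be invented.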
ELFE was designed to run with a small memory footprint, so it provides a complete interpreter that does not require any LLVM. On the other hand, the LLVM support in that "branch" of the XL family tree fell into a bad state of disrepair.
And today?
These days, I find myself with several subtly distinct variants of XL which all share the same XL0, but have very different run-time constraints.
Languages using the XL0 parse tree format
Languages currently using the same XL0 parse tree format include:
Tao3D has the most advanced library, and a lot of code written for it. But that code often depends on undesirable behaviours in the language, such as implicit by reference argument passing.
ELFE has the most advanced type system of all variants, being able to perform overloading based on the shape of parse trees, and having a rather complete set of control structures implemented in the library. It also has an interesting modular structure, and a rather full-featured interpreter.
XLR fell behind with respect to LLVM support, LLVM not being particularly careful about release-to-release source compatibility. On the other hand, it had the most advanced type inference system, which allowed it to get performance that was close to C for simple cases.
XL2 has been left aside for a few years, but is still all but obsolete. It would need a bit of love to make progress with the standard library and actually connect the XLR back-end as initially envisioned.
Re-converging?
Overall, the effort today is to re-converge these various branches and to catch up with LLVM. The ideal converged solution would have:
The modular structure, interpreter, remote control capabilities and type system developed for ELFE.
The real-time graphic capabilities of Tao3D, probably offered as an ELFE-style module.
A finished Haskell-style type inference system.
An XL2 front-end for those who prefer an imperative programming style.
Future ideas
In addition, I have been toying with a few ideas for a while:
Using is instead of -> as the one-and-only rewrite operator. I believe that for most programmers, X is 0 is immediately understandable, whereas the current X->0 requires an explanation.
Replacing the block node type with a sequence or array node type.
Currently, blocks without a content, such as ( ) or { }, have a blank name inside, which I find ugly. It would make more sense to consider them as arrays with zero length.
Furthermore, blocks are often used to hold sequences, for example sequences of instructions. It would be easier to deal with a block containing a sequence of instructions than with the current block containing an instruction or a chain of infix nodes.
Bootstrapping an XLR compiler or interpreter, to validate that the XLR-level language is good enough for a compiler.
Conclusion
This article was way too long.