Tuesday, January 29, 2008

Automated documentation code testing

During the first year or so of development work on Crunchy, I probably earned the nickname "Dr. NO!" among early Crunchy adopters, as I often resisted suggestions for adding new capabilities. At the time, Crunchy required that html pages have additional markup added (vlam = very little additional markup) so that Crunchy could process them properly. I wanted Crunchy-enabled tutorials to be very easy to create, without much additional work from tutorial writers, so that Crunchy would be adopted by many people. For that reason, I rejected most of the suggestions that were made, including some by Johannes, both while he was sponsored by Google as a Summer of Code student and afterwards when he became a co-developer. Since then, the situation has changed, mainly for two reasons:
  1. Johannes created a new basic infrastructure for Crunchy where we can introduce new capabilities via plugins, without needing to change a single line of the core in most instances.
  2. Based on the new architecture, I came up with a new way to process pages so that no additional markup was needed for Crunchy to do its magic. This is what makes it possible, for example, to interact with the official Python tutorial on the python.org site.
Now that it is so easy to implement new capabilities, I am revisiting some ideas I had previously rejected or ignored. The struggle I have is deciding when enough is enough before finally releasing an official version 1.0.

In any event, after reading some comments on this post by Georg Brandl, I started thinking about adding a new option to test code embedded in documentation. To quote from the comments on that post:

One thing that is seriously needed is the ability to run and test code snippets in some fashion. It's just too easy for documentation to get out of date relative to the code, and if you can effectively "unit test" your docs, you're in much better shape.

And I don't mean like doctests, because not everything lends itself well to that style of testing. If it's possible to mark up some code as being for the test fixture and some code as being what belongs in the doc, that would be good.
Alternatively, from another reader:

For me a key is being able to test code in the docs, and I think the key is being able to "annotate" a code snippet with information about the context in which it should run, and the output it should give.

I think that Crunchy is a very good platform to implement this. There are currently three complementary options I am considering, one of which I have started to implement.


The first option is to have something like the following [note that while I use html notation, Crunchy is now capable of handling reStructuredText, including the possibility of handling additional directives]:

Some normally hidden code, used for setup:
<pre title="setup_code name=first">
a=42
</pre>

Followed by the code sample to be tested:
<pre title="check_code name=first">
print a
</pre>

And the expected output:
<pre title="code_output name=first">
42
</pre>

Upon importing a document containing such examples, Crunchy would insert a button for each code sample, allowing the user to test the code by clicking on the button, invoking the appropriate setup, and comparing the result with the expected output. Alternatively, all such code samples in a document could be run with a single click on a button inserted at the top of the page. A JavaScript alert could be used to inform the user that all tests passed; otherwise, error messages could be inserted in the page indicating which tests passed and which failed.
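To make the mechanism concrete, here is a minimal sketch of how such a test might be run behind the scenes. All of the names below are hypothetical and do not come from Crunchy's actual code; the sketch also uses the print() function syntax rather than the print statement shown in the examples above.

# Hypothetical sketch: run one named documentation test by executing
# the hidden setup code, then the visible sample, and comparing the
# captured output with the expected output recorded in the page.
import sys
from io import StringIO

def run_check(setup_code, check_code, expected_output):
    namespace = {}
    exec(setup_code, namespace)            # hidden fixture, e.g. "a = 42"
    captured = StringIO()
    old_stdout, sys.stdout = sys.stdout, captured
    try:
        exec(check_code, namespace)        # visible code sample
    finally:
        sys.stdout = old_stdout
    return captured.getvalue().strip() == expected_output.strip()

# The "first" example above would then reduce to:
assert run_check("a = 42", "print(a)", "42")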

This type of approach could, in theory, be used for languages other than Python; code could be executed by passing information to a separate process launched in a terminal window, with the result fed back into Crunchy as described above.
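As a rough illustration, such a check might amount to no more than the following; the command and expected output here are purely made up.

# Hypothetical sketch: check a sample written in another language by
# running it in a separate process and capturing what it prints.
import subprocess

def check_external(command, expected_output):
    process = subprocess.Popen(command, stdout=subprocess.PIPE,
                               stderr=subprocess.PIPE)
    out, _ = process.communicate()
    return out.decode().strip() == expected_output.strip()

# e.g. check_external(["ruby", "sample.rb"], "42")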

A second approach is to use the same method used by doctest to combine code sample and expected output; the setup code could still be used as described above.
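For instance, the check above might then be written as a single block in the familiar doctest format; the title value below is only a guess at what such markup could look like, with the hidden setup block left unchanged:

<pre title="doctest name=first">
>>> print a
42
</pre>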

A third approach, a completely different one, could be used in more general situations than documentation code testing.

Currently, the Python code needs to be embedded inside an html (or rst) document. However, one could create links to code that lives inside separate Python files. For example, one could have the following:

<pre title="python_file">
<span title="python_file_name"> file_path </span>
<span title="python_file_linenumbers"> some_range </span>
</pre>

When viewing the above using a normal browser, one would see something like the following (using a fictitious example):

../crunchy_dir/crunchy.py
[1-3, 5, 7, 10-15]

However, when viewing the same page with Crunchy, the appropriate lines would be extracted from the file and displayed in the browser. Alternatively, instead of specifying the line numbers, one could have a directive to extract a specific function/method/class as in

<span title="python_file_function"> function_name </span>

which would instruct Crunchy to extract all the code for the function definition and insert it in the document. By using such links, the code in the documentation would always (by definition) be kept in sync with the real code. I realize that this is not exactly a novel idea, but it is one whose potential could be extended by using Crunchy in ways never seen before. However, this last approach will have to wait until after Crunchy version 1.0 has been released.
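Both directives are straightforward to implement with the standard library. The helpers below are a hypothetical sketch, not part of Crunchy; the second one relies on end_lineno, which requires Python 3.8 or later.

# Hypothetical sketch: turn a line-number spec like "[1-3, 5, 7, 10-15]"
# into a list of line numbers, and pull one function out of a source file.
import ast

def parse_ranges(spec):
    numbers = []
    for part in spec.strip("[] \n").split(","):
        part = part.strip()
        if "-" in part:
            start, end = (int(x) for x in part.split("-"))
            numbers.extend(range(start, end + 1))
        else:
            numbers.append(int(part))
    return numbers

def extract_function(file_path, function_name):
    with open(file_path) as f:
        source = f.read()
    lines = source.splitlines()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef) and node.name == function_name:
            # lineno/end_lineno give the extent of the definition
            return "\n".join(lines[node.lineno - 1:node.end_lineno])
    raise ValueError("function %r not found" % function_name)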

What do you think of these ideas?

2 comments:

Paddy3118 said...

I can't help but feel that maybe it's not going in the right way, André.
I thought that Crunchy would become the way to animate textual tutorials. If that is done, then maybe more effort could be spent on Crunchy's marketing?

André Roberge said...

I am not sure what you mean both when you write that it's not going the right way and when you write that you thought that Crunchy would become the way to animate textual tutorials.

Currently, afaik, there is no other tool that animates tutorials - so it's not a matter of competition.

Because Crunchy is so easily extensible, it makes sense (to me at least) to try and extend it in various ways, even if only as a proof-of-concept, to see what new features could be useful for people.

I am not so much interested in "marketing Crunchy" as I am in "marketing Python".